Thursday, January 29, 2009

LCNUG: NHibernate

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/lcnug_nhibernate.htm]

Yesterday, Robert Dusek and Hudson Akridge of GFX presented on NHibernate, a powerful object-relational mapper for persisting data in .NET. The meeting was a big success - we had our largest turnout yet, about 20 people. There was a lot of good dialogue both before and after the presentation. We also announced the progress on the new LCNUG website, including our flourishing job board (6 jobs from 3 different companies so far).

Tuesday, January 27, 2009

How to increase chances of being allowed to research

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/how_to_increase_chances_of_being_allowed_to_research.htm]

For any software project, there's always something new to research. Even if the flood of new technology suddenly froze, most projects would still struggle just to catch up with existing technology. A lot of small to mid-size departments don't have dedicated research teams (or even research tasks), so here are some ideas for subtly incorporating research items into your schedule.

  • Focus on small, concrete tasks that management cares about. A web team probably cares more about researching jQuery or Silverlight than about the WinForms DataGrid, or something ambiguous like "incorporating web best practices" (what does that practically mean?).

  • Emphasize the low-hanging fruit with the highest return - Not all research tasks are equal. A SQL static code analyzer (which benefits everyone on the team) may be far more profitable than some crusade to make sure no one uses Hungarian notation in C#.

  • Piggyback off of existing assignments. If you're implementing an ASPX page, it may be the time to investigate Ajax, jQuery, or even something smaller like just JSON - you'd essentially have "the wind at your back." You could research an unrelated task, like hosting your build process on virtual machines, but you'd be doing it all alone, without the support of your current assignments.

  • Focus on just one or two things at a time. You could easily list 50 things to do - new technologies, tools, refactoring, open-source projects, controls to integrate, upgrading your framework, testing, automation, code generation, etc... If you juggle too much, it will all crash and you'll have nothing.

  • Finish what you start; actually deliver something - "A bird in the hand is worth two in the bush." For many departments, the thinking is that it's better to have a weak solution that's completed (and hence usable - i.e. you have something) than a powerful solution that's "still in progress" (and hence unusable - i.e. you have nothing). The workplace is ablaze with buzzwords. Anyone can spew forth buzzwords or suggest grandiose visions, but at the end of the day, management cares about things that are actually done.

  • Work incrementally - Management may not initially allot 4 weeks to research how Ajax benefits your web app, but you could spend a day here integrating it, a day there using an update panel, and another day later pulling in the Ajax Control Toolkit. Yes, it's slower, but it's better than nothing.

  • Establish a track record to "earn" bigger opportunities - As you gradually get research items actually completed, you'll become more credible, and will therefore probably be given more opportunity to research bigger tasks. For example, an unknown new-hire may be allowed to "explore" for a day, but a credible senior developer - who's already delivered many successful features - may be allowed to explore a research task for weeks.

Thursday, January 22, 2009

Real Life: Taking the fridge door off

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/real_life_taking_the_fridge_door_off.htm]

Fixing a normal squeaky door is relatively easy - just tap out the axle that joins the hinges, oil it, tap it back in, and... no more squeaky hinge. After the 20th such hinge, I got the hang of it. So, whenever my wife says the hallway doors are squeaking (translation: "please fix them"), I look forward to an easy task. However, the other day she mentioned that the fridge door was squeaking. Now that was an issue. Taking off one hinge at a time works for a hallway door, but fridge doors aren't built like hallway doors, so you need to take the entire door off. Taking off the entire fridge door is hard (at least for me). But, alas, there was no other way. So, I got out the necessary tools, took the entire fridge door off, oiled the door axle, and... it too stopped squeaking! The moral is that, just like in software development, people often need to take one step back before taking two steps forward. Maybe that means throwing away precious code, reading a long article instead of just jumping to a quick solution, or writing a unit test harness. The current problem might require you to "take the fridge door off".

Monday, January 19, 2009

Ajax Issue: "The history state must be small enough to not make the url larger than 1024"

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/ajax_issue_the_history_state_must_be_small_enough_to_not_m.htm]

I hit a really weird Ajax bug the other day. I was working on an ASP.Net website and got this cryptic error whenever I clicked a button (one that triggered a postback) inside an Ajax UpdatePanel:

Microsoft JScript runtime error: Sys.InvalidOperationException: The history state must be small enough to not make the url larger than 1024
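
For context, that message comes from the ASP.NET Ajax history feature (added in .NET 3.5 SP1), which serializes history state into the URL. Here's a hedged, hypothetical code-behind sketch of the kind of code that populates that state (it assumes a ScriptManager on the page with EnableHistory="true"; the handler, key, and value names are made up):

// Inside a page's code-behind (needs using System.Web.UI;)
protected void TabButton_Click(object sender, EventArgs e)
{
    ScriptManager sm = ScriptManager.GetCurrent(this.Page);
    if (sm != null && sm.IsInAsyncPostBack)
    {
        // Each history point is serialized into the URL; if the accumulated
        // state pushes the URL past 1024 characters, the client throws the
        // Sys.InvalidOperationException above.
        sm.AddHistoryPoint("selectedTab", "tab1");
    }
}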

Then, when re-clicking a tab, you'd get another Ajax error:

Microsoft JScript runtime error: Object doesn't support this property or method
At this line:

if (element.tagName && (element.tagName.toUpperCase() === "SCRIPT"))

In this specific case, the element's tagName didn't have the toUpperCase() method. It had all worked before, so everything seemed strange. To make a long story short, it appeared to be a problem from installing Visual Studio 2008 SP1. We had installed the old Ajax toolkit. The new SP updated the System.Web.Extensions.dll assembly in the GAC, which generated different script resource files.
 

Old (which worked):
  • Assembly version: 3.5.0.0
  • File version: 3.5.21022.8
  • Path: C:\MyProject\System.Web.Extensions.dll

New (which did not work):
  • Assembly version: 3.5.0.0
  • File version: 3.5.30729.1
  • Path: C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Web.Extensions.dll


It worked on some machines and not on others because some machines had SP1 installed and others did not. Since the new version was installed in the GAC, the web app would always pick it up. Note that both DLLs had the exact same assembly identity (assembly version 3.5.0.0), but different file versions.
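
As an aside, a quick way to see both numbers for a given DLL is a small C# sketch like this (the path is hypothetical - point it at whichever copy you want to inspect):

using System;
using System.Diagnostics;
using System.Reflection;

class VersionCheck
{
    static void Main()
    {
        // Hypothetical path - adjust to the DLL you want to inspect.
        string path = @"C:\MyProject\System.Web.Extensions.dll";

        // Assembly version: part of the .NET identity that the GAC binds on.
        Console.WriteLine(AssemblyName.GetAssemblyName(path).Version);

        // File version: Win32 metadata, which can differ between builds
        // that share the same assembly version.
        Console.WriteLine(FileVersionInfo.GetVersionInfo(path).FileVersion);
    }
}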
 

Eventually, to get things working, we just uninstalled the new one from the GAC. (I guess if we had sufficient time, we'd have figured out how to make the app play nice with the new DLL.) However, this was tricky because we couldn't just do a normal Windows uninstall. Running:

Gacutil /uf System.Web.Extensions

will return:

Assembly could not be uninstalled because it is required by Windows Installer
Number of assemblies uninstalled = 1
Number of failures = 0

This blog post explained that this is essentially a bug, and that we can fix it from the registry:

http://blogs.msdn.com/alanshi/archive/2003/12/10/42690.aspx


In the registry, go to this key:

[HKLM\SOFTWARE\Classes\Installer\Assemblies\Global]

and find this item:

System.Web.Extensions,version="3.5.0.0",publicKeyToken="31bf3856ad364e35",processorArchitecture="MSIL",fileVersion="3.5.30729.1",culture="neutral"

Remove its data, so that the "Value data" textbox in the "Edit Multi-String" dialog box is empty.
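
If you'd rather script that registry edit, here's a minimal C# sketch (run it elevated; it assumes the exact value name above) that blanks out the value data:

using Microsoft.Win32;

class ClearInstallerReference
{
    static void Main()
    {
        // The full value name under the Installer\Assemblies\Global key.
        const string valueName =
            "System.Web.Extensions,version=\"3.5.0.0\",publicKeyToken=\"31bf3856ad364e35\","
            + "processorArchitecture=\"MSIL\",fileVersion=\"3.5.30729.1\",culture=\"neutral\"";

        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Classes\Installer\Assemblies\Global", true))
        {
            if (key != null && key.GetValue(valueName) != null)
            {
                // Blank the multi-string data so gacutil no longer sees a
                // Windows Installer reference holding the assembly in place.
                key.SetValue(valueName, new string[0], RegistryValueKind.MultiString);
            }
        }
    }
}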

Now, you can remove it from the GAC by running:

Gacutil /uf System.Web.Extensions

And confirm the removal (the assembly should no longer be listed):

Gacutil /l System.Web.Extensions

NOTE - you may need to copy System.Web.Extensions.dll to your web app's bin folder and recompile the solution in VS.


Lastly, to make all the other websites still work, re-install the old DLL:

gacutil /i C:\MyProject\System.Web.Extensions.dll

You should now be able to hit postbacks within Ajax update panels without errors.

Sunday, January 18, 2009

You're not a real dev unless you've read this book...

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/youre_not_a_real_dev_unless_youve_read_this_book.htm]

Every now and then some amazing book comes out. And then come the people who insist that "you're not a real developer" unless you've read that book. I was reminded of this while reading reviews on Amazon for some of the hot books out there. While there are core competencies that every dev should know, there are also a lot of fringe topics, and multiple books on the same topic. And while a lot of these things are valuable, I think such an exclusive approach is damaging because it emphasizes not what you know and can apply ("can you write code with design patterns?"), but rather what you've read.

 

For example, of course the GoF design patterns book is phenomenal. However, is it really that bad if someone read the C# translation of it instead (Design Patterns in C# by Steven John Metsker), or even skipped the books and went with purely online tutorials? I'd expect a "senior developer" to know what a design pattern is, recognize the buzzwords, and know how to apply them. However, if they got to that point by a different path than "normal", I think that's okay. Part of the problem is that one cannot read it all, so this mindset effectively encourages bluffing - developers buy classic books and display them on their bookshelf like trophies, and are afraid to let on about their shortcomings for fear of being rejected.

 

In short, I think it's far more effective to offer positive criteria like "developers on this team must be fluent in design patterns, automated testing, and writing clean code", as opposed to exclusive criteria like "you must have read book X".

Thursday, January 15, 2009

Tool: Survey Monkey

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/tool_survey_monkey.htm]

Surveys are a great way to find team consensus and get everyone's opinions - especially anonymous surveys. While you could use blank sheets of paper, another way is to offer a web survey. A great tool for that is Survey Monkey. I've taken several surveys with them before, but what I hadn't realized is that you can set up a free account and start sending out real surveys right away - almost like setting up a free Hotmail account. Of course there are limits, and they sell packages with more advanced features and higher quantities. However, for small and simple online surveys of 20 people, the free product works great.

You could ask your team all sorts of questions, like "Why don't we write more unit tests?", "Why do we break the build?", "What would it take to work twice as fast?", and the like.

Wednesday, January 14, 2009

Why good-intentioned devs might not write good unit tests

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/why_goodintentioned_devs_might_not_write_good_unit_tests.htm]

I'm a big fan of unit testing. A question related to "How many tests are sufficient?" is "Why don't we write good unit tests?" While I've seen some people attribute it to purely negative things like laziness, dumbness, or lack of care for code quality, I think that misses the mark. Sure, there are some devs who don't write tests for those reasons, but I think there are tons of other devs who are hard-working, smart, and do care about their work, yet still don't write good or sufficient unit tests. Calling these hard-working coworkers "dumb" isn't going to make anything better. Here are some reasons why a good-intentioned developer might not write tests.

  1. I think I already write sufficient unit tests for my code.

  2. I don't have time - the tests take too long to initially write.

  3. I don't have time - the tests take too long to maintain and/or they keep breaking.

  4. The unit tests don't really add value. It's just yet another buzzword. They don't actually catch the real errors. So it's not the best use of my time.

  5. It's so much faster to just (real quick) run through my feature manually because all the context is already there (the data, the web session, the integration with other features, etc...).

  6. My code isn't easily testable - unit tests are great for business logic in C#, but I write code other than C# (SQL, JS), or things that aren't business logic (like UI rendering), or my code is too complex for unit tests.

  7. My code isn't easily testable - there are too many dependencies and limits. For example, I can't even reference an ASP.Net CodeBehind in a unit test.

  8. The tests take too long to run (the full test suite takes about 10 minutes; even without the database tests, it still takes 3 minutes).

  9. I write code that already works, so it doesn't require unit tests.

  10. My code is so simple that it doesn't need tests. For example, I'm not going to test every option in a switch-case.

  11. Sounds great, but I just don't know how to write tests for my code.

Note that I absolutely don't offer these as excuses, but rather as practical ideas to help understand a different perspective so you can improve things. For example, if someone is working on a 2-million line project that takes 5 minutes just to compile, let alone run any sort of test, they might skip running the tests with an "I don't have time" mindset. Yes, I still think it's overall faster to write and run the tests, but at least it helps you understand their perspective so you can try to meet them halfway (perhaps improve their machine hardware, split up the solution, split up the tests, etc...). Or, if someone thinks that unit tests don't catch "real errors", then you can have a discussion with concrete examples (see the sketch below). Either way, understanding someone's reasons for doing something will help bridge the gap.
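
On point #10, for instance, a concrete example can show how cheap such a test actually is. Here's a hedged sketch (NUnit-style, with made-up names) that covers every branch of a simple switch-case in a handful of lines:

using NUnit.Framework;

// Hypothetical "simple" code that some would say doesn't need tests.
public static class OrderStatusFormatter
{
    public static string ToLabel(int statusCode)
    {
        switch (statusCode)
        {
            case 0: return "Pending";
            case 1: return "Shipped";
            case 2: return "Delivered";
            default: return "Unknown";
        }
    }
}

[TestFixture]
public class OrderStatusFormatterTests
{
    [Test]
    public void ToLabel_CoversEveryBranch()
    {
        // Four asserts cover every case, including the default.
        Assert.AreEqual("Pending", OrderStatusFormatter.ToLabel(0));
        Assert.AreEqual("Shipped", OrderStatusFormatter.ToLabel(1));
        Assert.AreEqual("Delivered", OrderStatusFormatter.ToLabel(2));
        Assert.AreEqual("Unknown", OrderStatusFormatter.ToLabel(99));
    }
}

Even if a test like this only catches the occasional regression, its cost is genuinely tiny - which is often the crux of objections #2, #3, and #10.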