Monday, January 19, 2009

Ajax Issue: "The history state must be small enough to not make the url larger than 1024"

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/ajax_issue_the_history_state_must_be_small_enough_to_not_m.htm]

I got a really weird Ajax bug the other day. I was working on an ASP.Net website and got this cryptic error whenever I clicked a postback button within an Ajax update panel.

Microsoft JScript runtime error: Sys.InvalidOperationException: The history state must be small enough to not make the url larger than 1024
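
(For background, this error appears to come from the ASP.NET AJAX history feature, which serializes state into the URL - hence the 1024-character cap. If you use history deliberately, the state has to stay small. Here's a hedged sketch of the SP1 server-side API; the names HistoryPage, ScriptManager1, and Tabs_ActiveTabChanged are illustrative, not from this project:)

using System;
using System.Web.UI;

// Hedged sketch: the SP1 history feature round-trips state in the URL,
// which is where the 1024-character cap comes from.
public partial class HistoryPage : Page
{
    protected ScriptManager ScriptManager1; // normally declared in the .designer.cs file

    protected void Tabs_ActiveTabChanged(object sender, EventArgs e)
    {
        // Keep history state tiny - a short key/value pair like this is fine:
        ScriptManager1.AddHistoryPoint("currentTab", "2");

        // Pushing anything large (say, a serialized object graph) makes the
        // URL exceed 1024 characters and throws the client-side
        // Sys.InvalidOperationException quoted above.
    }
}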

Then, on re-clicking a tab, you get another Ajax error:

Microsoft JScript runtime error: Object doesn't support this property or method
At this line:

if (element.tagName && (element.tagName.toUpperCase() === "SCRIPT"))

In this specific case, element.tagName didn't have a toUpperCase() method (i.e. it wasn't a string). It had all worked before, so everything seemed strange. To make a long story short, it appeared to be a problem from installing Visual Studio 2008 SP1. We had installed the old Ajax toolkit, but the new SP updated the System.Web.Extensions.dll assembly in the GAC, which produced different script resource files.
 

Old (which worked):
  • Assembly Version: 3.5.0.0
  • File Version: 3.5.21022.8
  • Location: C:\MyProject\System.Web.Extensions.dll

New (which did not work):
  • Assembly Version: 3.5.0.0
  • File Version: 3.5.30729.1
  • Location: C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Web.Extensions.dll


It worked on some machines and not on others because some machines had SP1 installed and others did not. The new version was installed in the GAC, so the web app would always load that one. Note that both assemblies had the exact same identity (e.g. assembly version 3.5.0.0), but different file versions.
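
A quick way to see which copy your code actually loads is to check the file version at runtime. A minimal sketch (assuming any .NET 3.5 console app that references System.Web.Extensions):

using System;
using System.Diagnostics;
using System.Reflection;

// Prints where System.Web.Extensions actually loaded from, plus both
// version numbers, so you can tell the RTM and SP1 builds apart.
class WhichAjaxDll
{
    static void Main()
    {
        Assembly asm = typeof(System.Web.UI.ScriptManager).Assembly;
        FileVersionInfo fvi = FileVersionInfo.GetVersionInfo(asm.Location);

        Console.WriteLine("Loaded from:      " + asm.Location);
        Console.WriteLine("Assembly version: " + asm.GetName().Version); // 3.5.0.0 either way
        Console.WriteLine("File version:     " + fvi.FileVersion);       // 3.5.21022.8 vs 3.5.30729.1
    }
}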
 

Eventually, to get things working, we just uninstalled the new one from the GAC. (I guess if we'd had sufficient time, we'd have figured out how to make the app play nice with the new DLL.) However, this was tricky because we couldn't just do a normal Windows uninstall. Running:

gacutil /uf System.Web.Extensions

will return:

Assembly could not be uninstalled because it is required by Windows Installer
Number of assemblies uninstalled = 1
Number of failures = 0

This blog post explained that this is essentially a bug, and that we can fix it from the registry:

http://blogs.msdn.com/alanshi/archive/2003/12/10/42690.aspx


In the registry, go to this key:

[HKLM\SOFTWARE\Classes\Installer\Assemblies\Global]

and find this value:

System.Web.Extensions,version="3.5.0.0",publicKeyToken="31bf3856ad364e35",processorArchitecture="MSIL",fileVersion="3.5.30729.1",culture="neutral"

Then remove its data, so that in the "Edit Multi-String" dialog, the "Value data" textbox is empty.
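
If you'd rather script that edit, here's a sketch of the same thing in code (run as an administrator; the value name is the SP1 entry quoted above, but verify the exact string on your machine in regedit first, since it may differ):

using System;
using Microsoft.Win32;

// Blanks out the Windows Installer reference that blocks gacutil /uf.
class ClearInstallerReference
{
    static void Main()
    {
        const string keyPath = @"SOFTWARE\Classes\Installer\Assemblies\Global";
        const string valueName =
            "System.Web.Extensions,version=\"3.5.0.0\",publicKeyToken=\"31bf3856ad364e35\"," +
            "processorArchitecture=\"MSIL\",fileVersion=\"3.5.30729.1\",culture=\"neutral\"";

        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(keyPath, true))
        {
            if (key == null)
                throw new InvalidOperationException("Key not found: " + keyPath);

            // Overwrite the multi-string data with an empty array - the
            // equivalent of emptying the "Value data" textbox in the
            // Edit Multi-String dialog.
            key.SetValue(valueName, new string[0], RegistryValueKind.MultiString);
        }
    }
}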

Now, you can remove it from the GAC by running:

gacutil /uf System.Web.Extensions

And confirm removal:

gacutil /l System.Web.Extensions

NOTE – you may need to copy System.Web.Extensions.dll to your web app's bin folder and recompile the solution in VS.


Lastly, so that all the other websites still work, re-install the old DLL:

gacutil /i C:\MyProject\System.Web.Extensions.dll

You should now be able to hit postbacks within Ajax update panels without errors.

Sunday, January 18, 2009

You're not a real dev unless you've read this book...

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/youre_not_a_real_dev_unless_youve_read_this_book.htm]

Every now and then some amazing book comes out. And then come people who insist that "you're not a real developer" unless you've read that book. I was reminded of this while reading Amazon reviews for some of the hot books out there. While there are core competencies that every dev should know, there are also a lot of fringe topics, and multiple books on the same topic. And while a lot of these things are valuable, I think such an exclusive approach is damaging because it emphasizes not what you know and can apply ("can you write code with design patterns?"), but rather what you've read.

 

For example, of course the GoF design patterns book is phenomenal. However, is it really that bad if someone read the C# translation of it instead (Design Patterns in C# by Steven John Metsker), or even skipped the books and went with purely online tutorials? I'd expect a "senior developer" to know what a design pattern is, recognize the buzzwords, and know how to apply them. However, if they got to that point by a different path than "normal" (whatever that is), I think that's okay. Part of the problem is that no one can read it all, so the exclusive approach effectively encourages bluffing - developers buy classic books and display them on their bookshelves like trophies, afraid to let on about their gaps for fear of being rejected.

 

In short, I think it's far more effective to offer positive criteria like "developers on this team must be fluent in design patterns, automated testing, and writing clean code", as opposed to exclusive criteria like "you must have read book X".

Thursday, January 15, 2009

Tool: Survey Monkey

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/tool_survey_monkey.htm]

Surveys are a great way to find team consensus and get everyone's opinions - especially anonymous surveys. While you could use blank sheets of paper, another way is to offer a web survey. A great tool for that is Survey Monkey. I've taken several surveys with them before, but what I didn't realize is that you can set up a free account and start sending out real surveys right away - almost like setting up a free Hotmail account. Of course there are limits, and they sell packages for more advanced features and higher volumes. However, for small, simple online surveys of 20 people, the free product works great.

You could ask your team all sorts of questions, like "Why don't we write more unit tests?", "Why do we break the build?", "What would it take to work twice as fast?", and the like.

Wednesday, January 14, 2009

Why good-intentioned devs might not write good unit tests

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/why_goodintentioned_devs_might_not_write_good_unit_tests.htm]

I'm a big fan of unit testing. A question related to "How many tests are sufficient?" is "Why don't we write good unit tests?" While I've seen some people attribute it to purely negative things like laziness, dumbness, or lack of care for code quality, I think that misses the mark. Sure, there are some devs who don't write tests for those reasons, but I think there are tons of other devs who are hard-working, smart, and do care about their work, yet still don't write good or sufficient unit tests. Calling these hard-working coworkers "dumb" isn't going to make anything better. Here are some reasons why a good-intentioned developer might not write tests.

  1. I think I already write sufficient unit tests for my code.

  2. I don't have time - the tests take too long to initially write.

  3. I don't have time - the tests take too long to maintain and/or they keep breaking.

  4. The unit tests don't really add value. It's just yet another buzzword. They don't actually catch the real errors. So it's not the best use of my time.

  5. It's so much faster to just (real quick) run through my feature manually because all the context is already there (the data, the web session, the integration with other features, etc...).

  6. My code isn't easily testable - unit tests are great for business logic in C#, but I write code other than C# (SQL, JS), or things that aren't business logic (like UI rendering), or my code is too complex for unit tests.

  7. My code isn't easily testable - there are too many dependencies and limits. For example, I can't even reference an ASP.Net CodeBehind in a unit test. (See the sketch at the end of this post for one common workaround.)

  8. The tests take too long to run (the full test suite takes about 10 minutes, even without the database tests it still takes 3 minutes).

  9. I write code that already works, so it doesn't require unit tests.

  10. My code is so simple that it doesn't need tests. For example, I'm not going to test every option in a switch-case.

  11. Sounds great, but I just don't know how to write tests for my code.

Note that I absolutely don't offer these as excuses, but rather as practical ideas to help you understand a different perspective so you can improve things. For example, if someone is working on a 2-million-line project that takes 5 minutes just to compile, let alone run any sort of test, they might skip running the tests with an "I don't have time" mindset. Yes, I still think it's overall faster to write and run the tests, but understanding their perspective at least lets you try to meet them halfway (perhaps improve their machine hardware, split up the solution, split up the tests, etc...). Or, if someone thinks that unit tests don't catch "real errors", then you can have a discussion with concrete examples. Either way, understanding someone's reasons for doing something will help bridge the gap.
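
On point 7 above, one common workaround (a sketch with hypothetical names, not the only approach) is to pull the logic out of the code-behind into a plain class that a test project can reference; the code-behind then just delegates to it:

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical example: the discount logic lives in a plain class with no
// Page dependency, so a test project can reference it directly.
public class DiscountCalculator
{
    public decimal GetDiscount(decimal orderTotal)
    {
        // The boundary at 100 is exactly the kind of logic worth testing.
        return orderTotal >= 100m ? orderTotal * 0.10m : 0m;
    }
}

[TestClass]
public class DiscountCalculatorTests
{
    [TestMethod]
    public void OrderOf100_Gets10PercentDiscount()
    {
        Assert.AreEqual(10m, new DiscountCalculator().GetDiscount(100m));
    }

    [TestMethod]
    public void OrderUnder100_GetsNoDiscount()
    {
        Assert.AreEqual(0m, new DiscountCalculator().GetDiscount(99m));
    }
}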

 

 

Tuesday, January 13, 2009

MSDN Dev Conference - Chicago

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/msdn_dev_conference__chicago.htm]

I was glad to attend the Microsoft Developer Conference yesterday in Chicago. Despite the snow, there were (I'm guessing) maybe 500 developers. It's always humbling going to these events and seeing so many top developers. Besides the usual star lineup of speakers, I also get a kick out of meeting other devs/architects in the same situation as I am. While the presentations are good, I see the main purpose of these events as networking and talking to real people face-to-face.

 

My key take-aways:

  • There is absolutely too much for one single person to learn. No matter how smart you are or how much time you think you have, you cannot "do it all"; therefore a critical skill for any dev lead or architect is knowing how to delegate and empower other people to innovate new things.

  • Team Foundation Server looks very interesting. Because we started our product back in .Net 1.0, before TFS existed, we wired up our process with open source packages like CruiseControl, MSBuild (which replaced NAnt for us), an issue-tracking system, SVN, etc... So we've already got a lot of momentum there, but TFS looks very promising. I got to talk to several people, like Angela Binkowski, Paul Hacker, and others whose blogs I don't know (sorry), and they made a good case. I'll probably blog about this more later.

  • JQuery rocks.

  • There are a lot of smart developers in Chicago and the midwest region.

  • I'm hoping that some of these people will be future speakers at the LCNUG.

I was also excited to see the Lake County .Net Users Group get a huge plug after Ron Jacobs' keynote presentation.

Sunday, January 11, 2009

How many unit tests are sufficient?

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/how_many_unit_tests_are_sufficient.htm]

By this time, every developer probably knows that, along with a hundred other tasks, they're also supposed to write unit tests. One common question from devs still new to testing is "How many tests are enough?" I think there are a couple of idealistic guidelines (disclaimer: I haven't personally implemented all the ideas below; it's more of a brain dump). My goal here isn't to say "as a developer, you need to add even more tasks to your already-full plate", but rather "here are practical ideas to help you know when you're done."

 

Automated techniques:

  1. Code coverage - Perhaps the most obvious, and automated, thing is code coverage. The team may agree that X% coverage is sufficient, and then have the automated build fail when new code doesn't meet that policy.

  2. Test lines of code - Some developers explain that they usually have X% as many lines of test code as production "system-under-test" code. I've heard devs mention between 40% and 100% (one line of test code per line of production code). I'm personally not sure how well this works out, or whether it couldn't be captured more effectively by code coverage.

Manual "code smell" techniques:

  1. The logic is fleshed out - Unit tests are great for checking boundary conditions, different code paths, various inputs, etc... If you don't have enough tests to catch the basic logic, then you don't have enough tests.

  2. The logic is documented - Related to the concept above, one of the benefits of unit tests is to document the code, especially the edge-case conditions (what happens if this input array is null, what if I pass in a negative int, etc...). Ideally there are sufficient tests such that the common boundary cases are easily exposed and documented for someone reviewing the code (see the sketch after this list).

  3. Will the tests catch errors? - Ideally, there is sufficient test coverage such that a test will fail if another developer "accidentally" breaks your code.

  4. Be able to write new tests - Ideally, there would be enough testing infrastructure such that you can write a new test for every logic error that arises, even for a component with little code coverage. For example, often having to write even a single unit test forces you to think about the component such that you could write more tests if you had to.
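
To make point 2 above concrete, here's a sketch (the method and names are hypothetical, using MSTest) where each edge case gets its own named test, so the boundary behavior reads like documentation:

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical system-under-test, just to illustrate the idea.
public static class MathUtil
{
    public static int SumOfPositives(int[] values)
    {
        if (values == null)
            throw new ArgumentNullException("values");

        int sum = 0;
        foreach (int v in values)
            if (v > 0) sum += v;
        return sum;
    }
}

[TestClass]
public class MathUtilTests
{
    // Each edge case gets its own named test, documenting the expected
    // behavior for anyone reviewing the code.
    [TestMethod]
    [ExpectedException(typeof(ArgumentNullException))]
    public void NullArray_Throws()
    {
        MathUtil.SumOfPositives(null);
    }

    [TestMethod]
    public void NegativeInts_AreIgnored()
    {
        Assert.AreEqual(3, MathUtil.SumOfPositives(new[] { -5, 1, 2 }));
    }

    [TestMethod]
    public void EmptyArray_ReturnsZero()
    {
        Assert.AreEqual(0, MathUtil.SumOfPositives(new int[0]));
    }
}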

Thursday, January 8, 2009

How to discourage a developer from working overtime

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/how_to_discourage_a_developer_from_working_overtime.htm]

A while back I pondered what it would take to motivate a developer to work overtime. I was thinking about the flipside of that - what would discourage a developer from working overtime?

  • Constantly change the feature on them - This can be like pulling the rug out from under their feet. I saw this all the time in consulting - on some projects, everything was "of absolute importance". People get burnt out and stop being motivated. After all, why waste my evening plowing through a feature if the whole thing is just going to be scrapped tomorrow at some executive's whim?
  • Assign boring tasks - This speaks for itself.
  • Provide slow hardware - Not having the proper tools to do your job is just demoralizing. Imagine your manager with a slow laptop - would they wait 60 seconds for their machine to unfreeze every time they tried to send a single email, or 10 seconds every time they clicked a new cell in Excel? Of course not; they'd get furious about how such a slow machine prevents them from effectively doing their work. Same thing for developers - every time the machine freezes while compiling, getting source code, or running tests, it demoralizes and frustrates the developer. Yes, savvy developers can optimize their machines, but at the end of the day, the .Net development environment has certain hardware needs. For example, it's just a waste of time to ask a developer to work on a machine with only 1 GB of RAM, a 5400 rpm hard drive, or a 1 GHz processor. They'll sit idle throughout the day, constantly losing their rhythm. The manager saves a few hundred bucks, but both demoralizes the developer and diminishes the return on a $100,000 resource (total cost of the developer = salary + benefits + other stuff HR could tell you about). It's an absolutely clueless business model.
  • Never reward positive accomplishments - Management can offer "non-monetary" rewards like verbal affirmation, or allotting schedule time to pursue a promising research project.
  • Waste their time during the normal work day - If a developer already "wastes" time due to excess meetings, pointless issues, rework from an originally bad design, or waiting on a slow machine, why would they spend their own evening to "make that time up" - time that should never have been taken from them in the first place?
  • Assign them to a "sinking ship" project - Some projects are fundamentally screwed - the core architecture is hopelessly lost, or there's already a run-away bug list, or the spec is unstable (or even contradictory). There's little motivation to work on this kind of suicide project.
  • Have them do a task the hard way because the manager won't pay for the proper tools - For example, have a developer spend 100 hours writing an Ajax datagrid when you could just buy third-party controls for far less. Or, have a developer scour through thousands of lines of database plumbing instead of using a code generator or ORM.

The irony of it all is that the rich get richer and the poor get poorer - i.e., a good environment will motivate the developers to work overtime (or at least be more productive during the day), hence getting farther ahead, whereas a bad environment will constantly demoralize, frustrate, and slow down its developers, thus getting farther behind.

This is just a partial list - anything to add? What sort of things discourage you from working overtime?