Thursday, June 30, 2011

Magic Cards: M12, Sphinx of Uthuun

I play Magic: The Gathering. If you don't know what that is, that's fine; it's a strategy card game where you play cards that represent spells, and you try to defeat your opponents. Definitely nerdy, but the fantasy flavor is great, and it's an excellent mental challenge for those interested in that sort of thing. I'm always looking for games that make me think and push my brain, and this is definitely one of those. There are lots of similar games, and many of them are geared toward kids (Pokemon, Yu-Gi-Oh, etc.), but MtG was the original, and it tends to have an older average player age, especially in the tournament scene, which I enjoy.

So the next set is a 'core' set, which features staple cards plus a bunch of new material and sets the tone for the next year. While the core sets were originally numbered (4th Edition, 5th, and so on), after 10th Edition the naming changed to Magic 2010, the set that injected tons of new life back into the game. We are now up to 2012, or M12.

One of the new cards that caught my eye is Sphinx of Uthuun. It's completely impractical (it costs 7 mana, which is enormous in competitive play generally), but it has some very interesting effects, and it combines two cards into one with a total cost of 10. That alone makes it interesting, and with the average cost of playable spells creeping up to the 5-6 range lately (intentionally; WotC is trying to slow the game down over the long term to make it less 'obvious' who will win based on opening hands alone), it could see play somewhere. Definitely in casual formats.

Here are the cards in question:

[Card images: Sphinx of Uthuun and Fact or Fiction]
It's definitely an interesting concept, and I may throw one of these into my Commander deck to play around with. Fact or Fiction is a great card, especially when you can stack your deck before it resolves to make the most of the 'I'm getting at least one of these cards' effect.

References:
Magic: The Gathering
Star City Games, the company that has successfully privatized the tournament scene.

A Small Follow-up on C++ vs C#

My recent post mentioned that game developers use C++ over C# almost exclusively, citing speed as the primary reason.

One of my co-workers sent me this article on the issue: it's an interesting read for those interested in the performance differences between C++ and C#, as well as x86 vs. x64.

The article concludes that a programmer who keeps performance in mind can make their code just as fast in C# as in C++, if not faster, plus they get the win of smoother development in C#. I'm not sure I agree with that at face value, but I can see the potential. If you assume (and you can, to some extent) that game developers tend to be some of the sharpest minds in coding, at least for their generation (I could write a whole article on the nature of game developers, but for now I will settle for a few links), then you can hope they will have the wherewithal to avoid the heavier libraries and other potentially-monolithic code that can creep in through reference libraries and 'high level' architecture. There are programs written to streamline C++ code to make the most of every clock cycle; I haven't done the research, but I assume something similar exists somewhere in C# land... or maybe that's an untapped market, and I should get to work coding something up. ;-)
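
To illustrate the kind of performance-mindedness I mean, here's a quick sketch of my own (the numbers and names are mine, not the article's): the same sum written with LINQ, which pays for iterator and delegate overhead in the hot path, and with a plain loop, which doesn't. Timing both with Stopwatch is the honest way to settle which one matters for your workload.

    using System;
    using System.Diagnostics;
    using System.Linq;

    class PerfSketch
    {
        static void Main()
        {
            int[] data = Enumerable.Range(0, 10000000).ToArray();

            // LINQ version: concise, but allocates iterators and delegates.
            Stopwatch sw = Stopwatch.StartNew();
            long linqSum = data.Where(n => (n & 1) == 0).Sum(n => (long)n);
            sw.Stop();
            Console.WriteLine("LINQ: {0} in {1} ms", linqSum, sw.ElapsedMilliseconds);

            // Hand-rolled loop: the same work with no extra allocations.
            sw = Stopwatch.StartNew();
            long loopSum = 0;
            for (int i = 0; i < data.Length; i++)
            {
                if ((data[i] & 1) == 0)
                {
                    loopSum += data[i];
                }
            }
            sw.Stop();
            Console.WriteLine("Loop: {0} in {1} ms", loopSum, sw.ElapsedMilliseconds);
        }
    }

Neither version is 'wrong'; the point is knowing which one you're writing, and why.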

Until next time, here are your reference links:
Code Project: C++ vs .Net, the benchmark article.
Ars' article on 'The Death March', the constant crunch-time for game developers.
The now-famous 'EA Spouse' article, detailing the harsh conditions forced on these young minds.

Monday, June 27, 2011

The Pragmatic Programmer, and Musings

As it's been almost a month since my last post, I feel an obligation to at least put something on this page to keep it from stagnating, and hopefully to keep myself motivated to keep up with it.

To that end, I'm going to discuss a fundamental part of my mentality as a programmer, rather than something recent I've been working on. Something to get the juices flowing, so to speak.

As a young programmer, I found myself leaving college in a position I'm sure many others have experienced: the languages and methodologies taught in school do little to prepare you for coding in a modern environment. Stress is placed on getting things to compile and showing off concepts like recursion, rather than on things like code cleanliness and readability. The concept of 'extreme programming' is largely absent from the college experience, or at least it was from mine.

My first job involved C#, a language I had no experience with. As a Computer Engineering student, I took classes in C++, but my senior-level projects were in straight C, running off a microprocessor that could handle little else. Before my start date, I was shipped the latest O'Reilly book on C# and told, 'Read this. See you in two weeks.'

I immediately loved the language. It threw away a lot of what I found cumbersome and painful in C/C++ (header files, unmanaged third-party libraries just to do things like string concats, etc.), and the advent of 'true' OO brought, or at least made me feel at the time like it brought, so much clarity and understanding. I briefly entertained going back to school at a later point, this time for game development, and the nail in the coffin for me was the loss of managed code: I certainly CAN code at a lower level, and have in the past, but is it something I want to do on a daily basis? Certainly not, and games are rarely, if ever, written in a language as high-level as C#: the clock cycles in today's consoles are too valuable to rely on something like the CLR. So that was that.

My first boss was a fan of extreme programming, and taught me that it is far better to write code that is human-readable. A great baseline to share: if what you are writing requires a comment to explain what it is doing, you might be writing it wrong. Hopefully even someone without programming experience can understand at least the concept of the line:
Ball.Roll(10, DistanceUnits.Feet, Direction.Left);
Certainly better than:
b1.Do(10, "f", 0);
Hopefully you see where I'm coming from with this. Naming is not just a good idea; it can be essential, especially when you have to pick the code back up in a year and make changes. The learning curve on your code is your own responsibility to maintain.
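
For the curious, here's roughly what the supporting types behind that first line might look like. This is purely my own illustrative sketch (the Ball class and its enums aren't from any real project):

    using System;

    public enum DistanceUnits { Feet, Meters }
    public enum Direction { Left, Right }

    public static class Ball
    {
        // The signature alone tells a reader what 10, Feet, and Left mean;
        // no comment (and no decoder ring) required.
        public static void Roll(int distance, DistanceUnits units, Direction direction)
        {
            Console.WriteLine("Rolling {0} {1} to the {2}.", distance, units, direction);
        }
    }

The enums cost a few extra lines up front, but they turn every call site into documentation.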

Two things changed my mentality drastically: a program called ReSharper, and a book called "The Pragmatic Programmer". The former is a wonderful piece of software with dozens of shortcuts and helpers for cleaning up code and writing boilerplate faster. If you have to choose one single Visual Studio tool to always have when writing C# code, this is it. At first I resisted many of the 'auto cleanup' suggestions, then slowly I gave in to them as I saw that the good folks over at JetBrains are pretty darn smart, and when they make a suggestion, chances are they are correct and you are, in fact, incorrect for disagreeing with them. Some of it is style and flavor, so I leave those decisions to you, and the program is not perfect (there are, for example, some bugs with ReSharper suggestions in Moq when using parameters with null defaults in mocked calls), so all I can say is give it a shot, consider the suggestions even if you don't agree with them, and it will make your code better over time.

The latter, "The Pragmatic Programmer", is a book full of axioms to keep in mind when writing code. It does a fantastic job of making you think about the responsibilities of your classes, where knowledge should live, how data should be communicated between objects, etc. While the verbiage can be a little 'odd' at times, I found the core concepts to be incredibly helpful. In fact, when I finished it I printed the 3-page bullet list at the end of the book of all the things to keep in mind and stuck it above my monitor. Every now and then, I would give it a quick glance and say 'am I following these concepts?'. Sometimes I was, and sometimes they caused me to go back and revisit code I had previously dubbed 'QA ready'. (Never say 'complete'. Software is never complete. Nothing is sacrosanct, everything can be changed, improved, and rearchitected.)

I wish I could go back and convey the things I know now to the young programmer just out of college. I would love to revisit code that was shipped with glaring issues in architecture, to clean it up and add all those features that at the time were 'impossible' simply because of the way it was put together. I think of the responsibilities on that fledgling programmer, and how the end result is nowhere near what it should have been given the importance of the code. But that is reality; software goes to market with glaring bugs, security holes, and inefficient processing cycles. Sometimes fixes come, other times they do not. Features are given up or compromised in the face of developer limitations. These things happen, and are accepted. No one can be perfect, everyone will make mistakes, and the code will evolve.

The biggest sin is the developer who cannot, or worse, will not evolve and better himself, learn from his mistakes, and pick up on the lessons his peers and mentors and even random voices on the Internet are whispering to him. Programmers, like the code they write, evolve or die. Languages are born faster than any normal human can master them, and concepts are deprecated before they are even released. It is a daunting and ultimately futile task to try to keep up with it all.

The code monkey is young, generally: you become a manager, or an architect, and move away from the rat race. Either way, you leave the code behind, and the mastering of the constant influx of new technologies to the next generation. That's not a requirement of a programmer's career, obviously, but it is the goal of many. Those that do not move on either become the rare pinnacles of their chosen fields, or stagnate and find the niche markets of large companies that similarly refused to evolve. That's not to say it's a "bad" career choice, because many of these experts in lost technologies can command exorbitant fees for their skills. But it is a move away from the next generation, an acknowledgement of the inability, unwillingness, or simple lack of interest in adapting to the technologies of tomorrow.

As always, I welcome people to weigh in on my thoughts. Until next time, happy coding!

References:
The Pragmatic Programmer
ReSharper

Friday, June 3, 2011

Tech Ed 2011

My most recent experience outside the state took me to Atlanta, for Tech Ed 2011. I was a little nervous about being in one of the child trafficking centers of the world during 'The Rapture 2011', but decided it was worth the risk. (And yes, my blog will probably be littered with snide little gems like this...)

One of my favorite topics recently is REST, especially as it pertains to WCF. In fact, right now at work we are developing a product that uses RESTful services over a SQL database (with ActiveRecord to access the data) and an MVC 3 (with Razor) front end. Fun stuff. So I jumped on a few of the lectures about the technology to see what we could do to improve the process.

Turns out, one of the fundamental things we were doing wrong was how we structured the RESTful responses we sent back to users. I'll go over this next part briefly. There are exceptions, but this is a good rule of thumb (a minimal service sketch follows the list):

  • When you do a select, either for an individual item or for a list of items, you want to do an http GET.
  • When you do a create for a new item, you want to do an http POST.
  • When you do an update for an existing item, you want to do an http PUT.
  • When you do a delete (something I hate in general, by the way), you want to do an http DELETE.
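
Here's a minimal sketch of how that verb mapping might look in a WCF service contract. The IItemService interface, the Item type, and the URI templates are all hypothetical, but the WebGet/WebInvoke attributes are the standard WCF way to wire the verbs up:

    using System.Collections.Generic;
    using System.ServiceModel;
    using System.ServiceModel.Web;

    [ServiceContract]
    public interface IItemService
    {
        [OperationContract]
        [WebGet(UriTemplate = "items")]                            // select (list) -> GET
        List<Item> GetItems();

        [OperationContract]
        [WebGet(UriTemplate = "items/{id}")]                       // select (one) -> GET
        Item GetItem(string id);

        [OperationContract]
        [WebInvoke(Method = "POST", UriTemplate = "items")]        // create -> POST
        Item CreateItem(Item item);

        [OperationContract]
        [WebInvoke(Method = "PUT", UriTemplate = "items/{id}")]    // update -> PUT
        Item UpdateItem(string id, Item item);

        [OperationContract]
        [WebInvoke(Method = "DELETE", UriTemplate = "items/{id}")] // delete -> DELETE
        void DeleteItem(string id);
    }

    public class Item
    {
        public string Id { get; set; }
        public string Name { get; set; }
    }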

Ok? Ok. For a brief aside: my first job involved two systems which drastically changed the way I think about deleting records. The first was the DMV system for a US territory. The second was the Title and Registration system for that same territory. I'll get into a fuller description in another post, but the short of it is that in a system of that kind, you never want to lose a record of anything, ever. Police and courts will want historical information, as in 'was this person, at any time, ever named Osama Bin Laden?'... maybe not that extreme, but you get the idea. So updates are king, and columns for start and end dates and status changes (and histories on those changes, etc.) become replacements for straight-up deletes. I try to carry that mentality with me: historical content is really handy, and unless you are changing state many, many times a day, another row in a table isn't going to kill your system. Let the DBAs and Business Owners of the world decide how long to keep data, when to archive it, and where.
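
To make the 'update instead of delete' idea concrete, here's a rough sketch. The entity and its fields are invented for illustration; the point is that a delete becomes a status change with a date:

    using System;
    using System.Collections.Generic;

    public class VehicleRegistration
    {
        public int Id { get; set; }
        public string PlateNumber { get; set; }
        public string Status { get; set; }
        public DateTime StartDate { get; set; }
        public DateTime? EndDate { get; set; }  // null means still active
        public List<string> StatusHistory = new List<string>();

        // A 'delete' is really just an update that closes out the record.
        // The row (and its history) sticks around for the police and the courts.
        public void Deactivate(string reason)
        {
            EndDate = DateTime.UtcNow;
            Status = "Inactive";
            StatusHistory.Add(string.Format("{0:u} - {1}", DateTime.UtcNow, reason));
        }
    }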

But back to the issue: RESTful responses for CRUD commands. If you've ever written a REST service and tested it out (say with Fiddler, which I highly recommend for that), you'll notice your create commands come back as 200s. That works, but it's not the 'correct' response. What you should send back on a successful create is a 201, plus a link to the new Uri in the header. Luckily, .NET 4.0 has this built in as a nice new feature to help you out:

  • WebOperationContext.Current.OutgoingResponse.SetStatusAsCreated(newUri)

This call (WebOperationContext lives in System.ServiceModel.Web) does both of these things automatically. The one caveat is that you will need to construct that new Uri yourself, but I'm assuming you already know the select structure, since there would be little point building the Create without the Read in your CRUD, if only for testing purposes.
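
Here's roughly how that looks inside a create operation. This sketch reuses the hypothetical Item type and URI shape from the contract above (the real service would implement IItemService; the other operations are elided), and the only part I'd call canonical is the SetStatusAsCreated call on OutgoingResponse:

    using System;
    using System.ServiceModel.Web;

    public class ItemService // hypothetical; would implement IItemService above
    {
        public Item CreateItem(Item item)
        {
            // ... persist the item; persistence assigns it an Id ...
            item.Id = Guid.NewGuid().ToString();

            // Build the Uri of the new resource from the GET template we
            // already expose, then let WCF emit the 201 status and the
            // Location header for us.
            Uri newUri = new Uri(
                WebOperationContext.Current.IncomingRequest.UriTemplateMatch.BaseUri,
                "items/" + item.Id);
            WebOperationContext.Current.OutgoingResponse.SetStatusAsCreated(newUri);

            return item;
        }
    }

Fiddler should now show "HTTP/1.1 201 Created" with a Location header pointing at the new item.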

Another fun tidbit: WCF 4 comes with automatic content negotiation, which can be really handy when you want to get your responses in, say, json and not xml, because you are a l33t Android programmer and want to just take your objects and run with them. Most of your headers for http GETs when testing your handy new RESTful services will have a line in them like "accept: application/xml". What you can now do is change that to "accept: application/json" and it will format correctly, without any service-side changes. This is very useful for clients across multiple platforms, which is one of the inherent strengths of REST already: this just makes it even more versatile.
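
If you're self-hosting, flipping that on is a single property on the endpoint behavior (web.config has an equivalent switch; the ItemService and IItemService here are still my hypothetical examples from above):

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Description;
    using System.ServiceModel.Web;

    class Program
    {
        static void Main()
        {
            WebServiceHost host = new WebServiceHost(
                typeof(ItemService), new Uri("http://localhost:8080/"));

            ServiceEndpoint endpoint = host.AddServiceEndpoint(
                typeof(IItemService), new WebHttpBinding(), "");

            // WCF 4 content negotiation: honor the Accept header
            // (application/xml vs application/json) with no per-operation code.
            endpoint.Behaviors.Add(new WebHttpBehavior
            {
                AutomaticFormatSelectionEnabled = true
            });

            host.Open();
            Console.WriteLine("Listening on http://localhost:8080/ ...");
            Console.ReadLine();
        }
    }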

This article is a little informal, I'm still trying to find my 'voice' in this format. Hopefully in the future I'll find the time to structure these and include things like code samples, etc. I welcome any questions or comments, and thanks for reading!

References for this article:
Yes, Atlanta really is this bad
MSDN OutgoingResponse (For setting a 201)
PDF for the specific lecture this article discusses
REST in Practice (O'Reilly book)

Console.WriteLine("Hello World");

So I've decided to put this blog back up.

For those not in the know, I set this up during Blogger.com's "Corrupted Database" issue, and lost everything. While I waited to see if it could be salvaged, I tried to create a backlog of things I wanted to talk about. My plan is to pick this back up and continue posting about interesting coding projects I am working on, both professionally and personally.

About me: I am a C# / BizTalk developer in New England, I am in my late 20s, and my wife and I are expecting our first child in November. This blog is intended to share problems, and sometimes even solutions, that I encounter on a daily basis in my job and in my free time coding.

Let the code fly!