Monday, December 5, 2011

Pathfinding Options

So this will probably be my most 'half-baked' article so far, mainly because it's on a topic I haven't fully fleshed out. At some point I may come back and add my conclusions, but for now, I am typing about the journey as I am taking it, so sorry if I steer people down the wrong path.

I am working on a side project that involves moving pieces around a rather complex game board. I am in serious need of pathfinding, which happens to be a topic we never really covered in my college courses (among them a course specifically on game design, which feels like a pretty big miss to me. Then again, I helped write the course structure, so I share some of the blame for that.)

So far, my search has turned up a number of algorithms, all designed to handle different levels of complexity. It does, however, give me (and you) a great place to start, so I will just post all the links so far and let you research on your own.


A*
Wiki
Tutorial (this is where I am starting, but it may not be enough to handle 'teleporting' squares)

Breadth First Search
Wiki

Dijkstra's
Wiki
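To make this a little less link-only, here is a minimal breadth-first search over a grid in C#. Everything in it (the GridPathfinder name, the bool[,] board representation) is my own sketch, not taken from the tutorials above. It handles uniform-cost boards; A* is essentially this plus a priority queue ordered by distance-plus-heuristic. The 'teleporting' squares I mentioned are just extra edges: when you expand a teleporter cell, add its destination to the neighbor list too.

```csharp
using System;
using System.Collections.Generic;

// Minimal breadth-first search over a grid of walkable cells.
// Fine for unweighted boards; switch to Dijkstra or A* once
// steps stop costing the same amount.
public static class GridPathfinder
{
    private static readonly (int dx, int dy)[] Neighbors =
        { (1, 0), (-1, 0), (0, 1), (0, -1) };

    // Returns the number of steps from start to goal, or -1 if unreachable.
    public static int ShortestPath(bool[,] walkable, (int x, int y) start, (int x, int y) goal)
    {
        int w = walkable.GetLength(0), h = walkable.GetLength(1);
        var dist = new Dictionary<(int, int), int> { [start] = 0 };
        var queue = new Queue<(int x, int y)>();
        queue.Enqueue(start);

        while (queue.Count > 0)
        {
            var cur = queue.Dequeue();
            if (cur == goal) return dist[cur];

            foreach (var (dx, dy) in Neighbors)
            {
                var next = (x: cur.x + dx, y: cur.y + dy);
                if (next.x < 0 || next.x >= w || next.y < 0 || next.y >= h) continue;
                if (!walkable[next.x, next.y] || dist.ContainsKey(next)) continue;
                dist[next] = dist[cur] + 1; // every step costs 1
                queue.Enqueue(next);
            }
        }
        return -1; // goal unreachable
    }
}
```

If you keep a came-from dictionary alongside dist, you can walk it backwards from the goal to recover the actual path, not just its length.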

Good luck, and happy path finding!

Friday, October 28, 2011

Wolf Run Green: My List. Thoughts?

Here is the new list I am looking to put together for the local circuit:



  • Main Deck:
    • 17 Forest
    • 4 Rampant Growth
    • 4 Garruk, Primal Hunter
    • 4 Primeval Titan
    • 4 Birds of Paradise
    • 4 Dungrove Elder
    • 3 Green Sun's Zenith
    • 3 Solemn Simulacrum
    • 3 Beast Within
    • 2 Inkmoth Nexus
    • 2 Llanowar Elves
    • 2 Kessig Wolf Run
    • 2 Mountain
    • 1 Batterskull
    • 1 Wurmcoil Engine
    • 1 Acidic Slime
    • 1 Copperline Gorge
    • 1 Rootbound Crag
    • 1 Slagstorm
  • Sideboard:
    • 3 Thrun, the Last Troll
    • 2 Sword of Feast and Famine
    • 2 Blasphemous Act
    • 2 Ancient Grudge
    • 1 Tree of Redemption
    • 1 Arc Trail
    • 1 Viridian Corrupter
    • 1 Ghost Quarter
    • 1 Gut Shot
    • 1 Acidic Slime
I welcome opinions on the numbers, especially the sideboard! I was torn on the Slime, or whether I need something like the fourth Beast Within. In the main deck, I'm also torn on Copperline versus Rootbound: in the end, either one tutored up by a Titan gives you red mana the next turn (provided you aren't getting Inkmoth, etc.), but still.

17th at Connecticut 2011s with Red Deck!

So I was the highest-placed 'burninator' in States this year. I knocked out 2 other RDW decks on the way to the top, the key being 4 Vulshok Refugees and 8 ways to kill their copies of the red dread. I was 5-1 going into the last round, and we had exactly 128 players... so we had 16 people in contention for Top 8. I lost to the man who eventually won the event. He ran W Humans: games 1 and 2 he went 'turn 1 Champion, turn 2 human/human/swing for 3'... Game 1 involved an Angelic I couldn't answer, game 2 a Hero I also couldn't answer, and that was all she wrote.

Since then I've been tweaking the deck, and after running the numbers, I've come up with a list that I am pretty happy with. My biggest divergence from this is my lack of Grim Lavamancer: I would just rather have Spikeshot Elder; I feel Grim just sits around without enough graveyard cards to be effective half the time.

Here is the list I recommend: it is 5 cards off from what I ran at States. In light of recent events I'm planning to move from RDW to Primetime (Primeval Titan)... anything not blue makes me happy. :-)



  • Main Deck
    • 18 Mountain
    • 4 Stromkirk Noble
    • 4 Stormblood Berserker
    • 4 Shrine of Burning Rage
    • 4 Incinerate
    • 3 Arc Trail
    • 3 Brimstone Volley
    • 3 Koth of the Hammer
    • 3 Chandra's Phoenix
    • 2 Hero of Oxid Ridge
    • 3 Rootbound Crag
    • 2 Spikeshot Elder
    • 2 Grim Lavamancer
    • 2 Volt Charge
    • 1 Geistflame
    • 1 Goblin Arsonist
    • 1 Gut Shot
  • Sideboard
    • 4 Vulshok Refugee
    • 3 Ancient Grudge
    • 1 Traitorous Blood
    • 1 Arc Trail
    • 1 Hero of Oxid Ridge
    • 1 Perilous Myr
    • 1 Manic Vandal
    • 1 Manabarbs
    • 2 Dismember
Koth comes out VS aggro generally, Hero VS control, mind your Arc Trail/Incinerate ratio depending on the creature quantity and sizes coming at you, and make sure to pull in off color killers VS the mirror (and arguably VS poison for Crusader). Barbs is basically only for Wolf Run.

Good luck slinging spells!

XmlSerializer VS DataContractSerializer

So I am doing a project on the side involving a RESTful WCF Service with many different types of clients. Recently I was stumped by several issues with the XmlSerializer and the attributes on my data classes: the main issue originated from letting WCF (which uses the DataContractSerializer) serialize the responses, but not using the DataContractSerializer to deserialize them on the client. (Which was clearly a mistake from the start... the lesson here is don't mix and match technologies when you have the native tools available!)

Some fun facts if you find yourself in this predicament (or really, REALLY, want to use XmlSerializer):

  • XmlSerializer will not recognize lists when the complex objects in those lists live in a different namespace than the list itself, even if you specify this fact on the XmlElement for said list. The same goes for arrays. And give up on loosely typed interfaces like 'IEnumerable': the serializer chokes on these.
  • You will be stuck with the yucky 'http://schemas.microsoft.com/2003/10/Serialization/' + class namespace in your xml. This can become VERY inconvenient for your class structure, especially when paired with the first issue.
  • The XmlSerializer will fail outright if it finds a root node it doesn't understand, but it will simply null out any inner nodes it can't handle: make sure to check your entire object; a single element in a complex class can be null without the serializer giving you any warning.
  • Off Topic: One issue I have dealt with a little in other projects, but not for this, is the complete lack of support for serializing multidimensional arrays. One way around it is to turn your array into a single array of complex objects with more in it than just the next 'dimension'. Even a dummy 'int' in the class will work, because it prevents the serializer from trying to 'simplify' the code into a structure it can't handle. (I should note, this recommendation comes from BizTalk experience, not WCF, so it may not apply. But if you are stuck, it's worth trying.)
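Here is a sketch of that wrapper-class trick for multidimensional arrays. All the names (Row, Board, the Dummy field) are invented for illustration; the point is that XmlSerializer cannot handle int[,], but it is perfectly happy with a flat array of complex objects:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Instead of int[,], wrap each 'dimension' in a complex class.
// The Dummy field exists only to keep the serializer from trying
// to 'simplify' Row into a structure it can't handle.
public class Row
{
    public int Dummy;     // placeholder member, per the bullet above
    public int[] Cells;
}

public class Board
{
    public Row[] Rows;
}

public static class BoardXml
{
    public static string Serialize(Board board)
    {
        var serializer = new XmlSerializer(typeof(Board));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, board);
            return writer.ToString();
        }
    }

    public static Board Deserialize(string xml)
    {
        var serializer = new XmlSerializer(typeof(Board));
        using (var reader = new StringReader(xml))
            return (Board)serializer.Deserialize(reader);
    }
}
```

Note that everything here lives in one namespace on purpose, so the list-in-a-different-namespace gotcha from the first bullet never comes up.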

The simple solution (provided you find this, or some similar article) is to use the serializer intended for WCF, and to stick with the DataContract/DataMember structure of more classic WCF service classes. So long as you specify names and namespaces, and don't care about fine-grained serialization control, everything should work great.
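For the record, here is roughly what that DataContract/DataMember structure looks like, with explicit names and namespaces. The type and namespace are invented for illustration; the round-trip helper just demonstrates that the DataContractSerializer handles its own output cleanly:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

// The 'classic WCF' approach: explicit Name and Namespace on the
// contract, and the DataContractSerializer on both ends.
[DataContract(Name = "GamePiece", Namespace = "http://example.com/board")]
public class GamePiece
{
    [DataMember(Name = "Id", Order = 0)]
    public int Id { get; set; }

    [DataMember(Name = "Position", Order = 1)]
    public string Position { get; set; }
}

public static class ContractRoundTrip
{
    // Serialize and immediately deserialize, mimicking what happens
    // between a WCF service and a well-behaved client.
    public static GamePiece Clone(GamePiece piece)
    {
        var serializer = new DataContractSerializer(typeof(GamePiece));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, piece);
            stream.Position = 0;
            return (GamePiece)serializer.ReadObject(stream);
        }
    }
}
```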

If you look around, you will find lots of recommendations to use 'ResponseFormat = WebMessageFormat.Xml' on your contracts. If you can avoid it, do so - WCF 4 lets clients specify their return types in the request header (an issue I discussed a little in a previous post), and IMO that is a far better way to go, especially if you have any plans of interfacing with clients more at home with JSON (e.g., Android devices).

Happy coding!

References:
http://www.danrigsby.com/blog/index.php/2008/03/07/xmlserializer-vs-datacontractserializer-serialization-in-wcf/
http://social.msdn.microsoft.com/Forums/en-AU/csharpgeneral/thread/dfd587ec-a269-49a9-a1a5-0f69d915c776

Thursday, June 30, 2011

Magic Cards: M12, Sphinx of Uthuun

I play Magic: The Gathering. If you don't know what this is, that's fine; it's a strategy card game where you play cards that represent spells, and you try and defeat your opponents. Definitely nerdy, but the fantasy flavor is great, and it's also an excellent mental challenge for those interested in that sort of thing. I'm always looking for games that make me think and push my brain, and this is definitely one of those. There are lots of similar games, and a lot of them are geared to kids (Pokemon, Yu-Gi-Oh, etc), but MtG was the original, and tends to have an older average age, especially in the tournament scene, which I enjoy.

So the next set is called a 'core' set, and features staple cards plus a bunch of new stuff. It sort of sets the tone for the next year. While originally numbered 4th Edition, 5th, etc., after 10th they changed to Magic 2010, the set that shot tons of new life back into the game. We are now up to 2012, or M12.

One of the new cards that caught my eye is Sphinx of Uthuun, a completely impractical card (it costs 7, which is generally enormous in terms of competitive play) that nonetheless has some very interesting effects, combining two cards into one with a total cost of 10. That alone makes it interesting, and with the average cost of playable spells creeping up to the 5-6 range lately (intentionally; WotC is trying to slow the game down over the long term to make it less 'obvious' who will win based just on opening hands), it could see play somewhere. Definitely in casual formats.

Here are the cards in question:




It's definitely an interesting concept, and I may throw one of these in my Commander deck to play around with. Fact or Fiction is a great card, especially when you can stack your deck before it resolves to make the most of the 'I'm getting at least one of these cards' effect.

References:
Magic: The Gathering
Star City Games, the company that has successfully privatized the tournament scene.

A Small Follow-up on C++ vs C#

My recent post mentioned the fact that game developers use C++ over C# almost exclusively, citing speed as the primary reason.

One of my co-workers sent me this article on the issue: it's an interesting read for those interested in the performance differences of C++, C#, as well as x86 vs x64.

The article concludes by saying a programmer who keeps performance in mind can make their code just as fast in C# as in C++, if not faster, plus they get the win of smoother development in C#. I'm not sure I agree with that at face value, but I can see the potential. If you assume (and you can, to some extent) that game developers tend to be some of the sharpest minds in coding, at least for their generation (I could write a whole article on the nature of game developers, but for now I will settle on a few links), then you can hope they will have the wherewithal to avoid the heavier libraries and other potentially-monolithic code that can creep in through reference libraries and 'high level' architecture. There are programs written to streamline C++ code to make the most of every clock cycle; I haven't done the research, but I assume something similar exists somewhere in C# land... or maybe that's an untapped market, and I should get to work coding something up. ;-)

Until next time, here are your reference links:
Code Project: C++ vs .Net, the benchmark article.
Ars' article on 'The Death March', the constant crunch-time for game developers.
The now-famous 'EA Spouse' article, detailing the harsh conditions forced on these young minds.

Monday, June 27, 2011

The Pragmatic Programmer, and Musings

As it's been almost a month since my last post, I feel an obligation to at least put something on this page to keep it from stagnating, and hopefully to keep myself motivated to keep up with it.

To that end, I'm going to discuss a fundamental part of my mentality as a programmer, rather than something recent I've been working on. Something to get the juices flowing, so to speak.

As a young programmer, I found myself leaving college in a position I'm sure many others have been in: the languages taught in school, and the methodologies, do little to promote coding in a modern environment. Stress is placed on getting things to compile and showing off concepts like recursion, rather than on things like code cleanliness and readability. The concept of 'extreme programming' is largely absent from the college experience, or at least it was from mine.

My first job involved C#, a language I had no experience with. As a Computer Engineering student, I took classes in C++, but my senior level projects were in straight C, and run off of a microprocessor that could handle little else. I was given time before my start date, shipped the latest O'Reilly book on C#, and told 'read this. See you in 2 weeks'.

I immediately loved the language, and felt it threw away a lot of what I found cumbersome and painful in C/C++ (header files, unmanaged third-party libraries just to do things like string concats, etc.), and the advent of 'true' OO brought, or at least made me feel at the time like it brought, so much clarity and understanding. I briefly entertained going back to school at a later point, this time for game development, and the nail in the coffin for me was the loss of managed code: I certainly CAN code at a lower level, and have in the past, but is it something I want to do on a daily basis? Certainly not, and games are rarely, if ever, written in a language as high-level as C#: the clock cycles in today's consoles are too valuable to rely on something like the CLR. So that was that.

My first boss was a fan of extreme programming, and taught me it is far better to write code that is human readable. A great baseline to share: if what you are writing requires a comment to explain what it is doing, you might be writing it wrong. Hopefully even someone without programming experience can understand at least the concept of the line:
Ball.Roll(10, DistanceUnits.Feet, Direction.Left);
Certainly better than:
b1.Do(10, "f", 0);
Hopefully you see where I'm coming from with this. Naming is not just a good idea; it can be essential, especially when you have to pick the code back up in a year and make changes. The learning curve on your code is your own responsibility to maintain.

Two things changed my mentality drastically: a program called ReSharper, and a book called "The Pragmatic Programmer". The former is a wonderful piece of software with dozens of shortcuts and helpers for cleaning up code and writing boilerplate quicker. If you have to choose one single VS tool to always have when writing C# code, this is it. At first I resisted many of the 'auto cleanup' suggestions; then slowly I gave in to them as I saw the good folks over at JetBrains are pretty darn smart, and when they make a suggestion, chances are they are correct and you are, in fact, incorrect for disagreeing with them. Some of it is style and flavor, so I leave those decisions to you, and the program is not perfect (there are, for example, some bugs with ReSharper suggestions in Moq when using parameters with null defaults in mocked calls), so all I can say is give it a shot, consider the suggestions even if you don't agree with them, and it will make your code better over time.

The latter, "The Pragmatic Programmer", is a book full of axioms to keep in mind when writing code. It does a fantastic job of making you think about the responsibilities of your classes, where knowledge should live, how data should be communicated between objects, etc. While the verbiage can be a little 'odd' at times, I found the core concepts to be incredibly helpful. In fact, when I finished it I printed the 3-page bullet list at the end of the book of all the things to keep in mind and stuck it above my monitor. Every now and then, I would give it a quick glance and say 'am I following these concepts?'. Sometimes I was, and sometimes they caused me to go back and revisit code I had previously dubbed 'QA ready'. (Never say 'complete'. Software is never complete. Nothing is sacrosanct, everything can be changed, improved, and rearchitected.)

I wish I could go back and convey the things I know now to the young programmer just out of college. I would love to revisit code that shipped with glaring issues in architecture, to clean it up and add all those features that at the time were 'impossible' simply because of the way it was put together. I think of the responsibilities on that fledgling programmer, and how the end result is nowhere near what it should have been given the importance of the code. But that is reality; software goes to market with glaring bugs, security holes, and inefficient processing cycles. Sometimes fixes come, other times they do not. Features are given up or compromised in the face of developer limitations. These things happen, and are accepted. No one can be perfect, everyone will make mistakes, and the code will evolve.

The biggest sin is the developer who cannot, or worse will not, evolve and better himself, learn from his mistakes, pick up on the lessons his peers and mentors and even random voices on the Internet are whispering to him. Programmers, like the code they write, evolve or die. Languages are born faster than any normal human can master them, and concepts are deprecated before they are even released. It is a daunting and ultimately futile task to try and keep up with it all. The code monkey is young, generally: you become a manager, or an architect, and move away from the rat race. Either way, you leave the code behind, and leave the mastering of the constant influx of new technologies to the next generation. That's not a requirement of a programmer's career, obviously, but it is the goal of many. Those that do not make that move either become the rare pinnacles of their chosen fields, or stagnate and find the niche markets of large companies who similarly refused to evolve. That's not to say it's a "bad" career choice, because many of these experts of lost technologies can command exorbitant fees for their skills. But it is a move away from the next generation, an acknowledgement of the inability, or unwillingness, or even lack of interest in adapting to the technologies of tomorrow.

As always, I welcome people to weigh in on my thoughts. Until next time, happy coding!

References:
The Pragmatic Programmer
ReSharper

Friday, June 3, 2011

Tech Ed 2011

My most recent experience outside the state took me to Atlanta, for Tech Ed 2011. I was a little nervous about being in one of the child trafficking centers of the world during 'The Rapture 2011', but decided it was worth the risk. (And yes, my blog will probably be littered with snide little gems like this...)

One of my favorite topics recently is REST, especially as it pertains to WCF. In fact, right now at work we are developing a product that uses RESTful services over a SQL database (with ActiveRecord to access the data) and an MVC 3 (with Razor) front end. Fun stuff. So I jumped on a few of the lectures about the technology to see what we could do to improve the process.

Turns out, one of the fundamental things we were doing wrong was how we handled the structure of the RESTful responses to the users. I'll go over this next part briefly; there are exceptions, but this is a good rule of thumb:

  • When you do a select, either for an individual item or for a list of items, you want to do an http GET.
  • When you do a create for a new item, you want to do an http POST.
  • When you do an update for an existing item, you want to do an http PUT.
  • When you do a delete (something I hate in general, by the way), you want to do an http DELETE.
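On a WCF 4 REST contract, that verb-per-operation mapping looks roughly like this. The 'pieces' resource and the Piece type are invented for illustration; the attributes themselves are the standard System.ServiceModel.Web ones:

```csharp
using System.ServiceModel;
using System.ServiceModel.Web;

public class Piece { public int Id; public string Position; }

// One operation per CRUD action, each bound to the matching http verb.
[ServiceContract]
public interface IPieceService
{
    [WebGet(UriTemplate = "pieces/{id}")]                       // select -> GET
    Piece GetPiece(string id);

    [WebInvoke(Method = "POST", UriTemplate = "pieces")]        // create -> POST
    Piece CreatePiece(Piece piece);

    [WebInvoke(Method = "PUT", UriTemplate = "pieces/{id}")]    // update -> PUT
    Piece UpdatePiece(string id, Piece piece);

    [WebInvoke(Method = "DELETE", UriTemplate = "pieces/{id}")] // delete -> DELETE
    void DeletePiece(string id);
}
```

WebGet defaults to GET, which is why it gets its own attribute; everything else goes through WebInvoke with an explicit Method.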

Ok? Ok. For a brief aside: my first job involved two systems which drastically changed the way I think about deleting records. The first was the DMV system for a US territory. The second was the Title and Registration system for that same territory. I'll get into a fuller description of that in another post, but the short of it is: in a system of that kind, you never want to lose a record of anything, ever. Police and courts will be wanting historical information, as in 'was this person, at any time, ever named Osama Bin Laden?'... maybe not that extreme, but you get the idea. So updates are king, and columns for start and end dates, and status changes (and histories on those changes, etc) become replacements for straight-up deletes. I try to carry that mentality with me: historical content is really handy, and unless you are changing state many many times a day, another row in a table isn't going to kill your system. Let the DBAs and Business Owners of the world handle how long to keep data, when to archive it and where, etc.

But back to the issue: RESTful responses for CRUD commands. If you've ever written a REST service and tested it out (say with Fiddler, which I highly recommend as a product for that), you'll notice your create commands come back as 200s. That's fine, but it's not a 'correct' response. What you should send back on a successful create is a 201, plus a link to the new Uri in the header. Luckily .Net 4.0 has this built in as a nice new feature to help you out:

  • WebOperationContext.Current.OutgoingResponse.SetStatusAsCreated(newUri)

This command (which lives in System.ServiceModel.Web) automatically does both of these things. The one caveat is you will need to construct that new Uri yourself, but I'm assuming you know the select structure already, since there would be little point building the Create in your CRUD without the Read, if only for testing purposes.
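Here is roughly what that looks like inside a create operation. The service class, the SaveToDatabase call, and the uri format are all placeholders for your own logic; the one real piece is the SetStatusAsCreated call on OutgoingResponse:

```csharp
using System;
using System.ServiceModel.Web;

public class Piece { public int Id; }

public class PieceService
{
    // Sketch of a RESTful create: respond 201 + Location instead of a plain 200.
    public Piece CreatePiece(Piece piece)
    {
        int newId = SaveToDatabase(piece);

        // Construct the Uri of the resource we just created
        // (i.e., the url your Read/GET already answers to).
        var newUri = new Uri("http://example.com/service/pieces/" + newId);

        // Sets the 201 Created status AND the Location header in one call.
        WebOperationContext.Current.OutgoingResponse.SetStatusAsCreated(newUri);

        piece.Id = newId;
        return piece;
    }

    private int SaveToDatabase(Piece piece) { return 42; /* placeholder persistence */ }
}
```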

Another fun tidbit: WCF 4 comes with automatic content negotiation, which can be really handy when you want to get your responses in, say, json and not xml, because you are a l33t Android programmer and want to just take your objects and run with them. Most of your headers for http GETs when testing your handy new RESTful services will have a line in them like "accept: application/xml". What you can now do is change that to "accept: application/json" and it will format correctly, without any service-side changes. This is very useful for clients across multiple platforms, which is one of the inherent strengths of REST already: this just makes it even more versatile.
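From the client side, all the content negotiation takes is flipping that Accept header. A minimal sketch (the uri is a placeholder, and nothing is actually sent here, just the request being built):

```csharp
using System.Net;

public static class Negotiation
{
    // Build a GET whose Accept header asks WCF 4 for json instead of xml.
    // No service-side changes are needed; the same operation serves both.
    public static HttpWebRequest BuildJsonRequest(string uri)
    {
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.Method = "GET";
        request.Accept = "application/json"; // or "application/xml"
        return request;
    }
}
```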

This article is a little informal, I'm still trying to find my 'voice' in this format. Hopefully in the future I'll find the time to structure these and include things like code samples, etc. I welcome any questions or comments, and thanks for reading!

References for this article:
Yes, Atlanta really is this bad
MSDN OutgoingResponse (For setting a 201)
PDF for the specific lecture this article discusses
REST in Practice (O'Reilly book)

Console.WriteLine("Hello World");

So I've decided to put this blog back up.

For those not in the know, I set this up during Blogger.com's "Corrupted Database" issue, and lost everything. While I waited to see if it could be salvaged, I tried to create a backlog of things I wanted to talk about. My plan is to pick this back up and continue posting about interesting coding projects I am working on, both professionally and personally.

About me: I am a C# / BizTalk developer in New England, I am in my late 20s, and my wife and I are expecting our first child in November. This blog is intended to share problems, and sometimes even solutions, that I encounter on a daily basis in my job and in my free time coding.

Let the code fly!