<?xml version='1.0' encoding='UTF-8'?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:openSearch="http://a9.com/-/spec/opensearchrss/1.0/" xmlns:blogger="http://schemas.google.com/blogger/2008" xmlns:georss="http://www.georss.org/georss" xmlns:gd="http://schemas.google.com/g/2005" xmlns:thr="http://purl.org/syndication/thread/1.0" version="2.0"><channel><atom:id>tag:blogger.com,1999:blog-1593691839906710481</atom:id><lastBuildDate>Sun, 10 Aug 2025 06:48:22 +0000</lastBuildDate><category>games</category><category>society</category><category>.NET</category><category>education</category><category>hype</category><category>.NUTS</category><category>boxing</category><category>game design</category><category>creative process</category><category>creativity</category><category>design patterns</category><category>enums</category><category>firefox</category><category>generics</category><category>incompetence</category><category>interfaces</category><category>mass effect</category><category>mozilla</category><category>paradigms</category><category>performance</category><category>piracy</category><category>profiling</category><category>C#</category><category>OOAD</category><category>PyCon</category><category>SSADM</category><category>UML</category><category>assassin&#39;s creed</category><category>bad habits</category><category>bug reporting</category><category>collections</category><category>discussion</category><category>dreamfall</category><category>dreamfall chapters</category><category>empathy</category><category>generators</category><category>inheritance</category><category>iterators</category><category>market</category><category>mediocrity</category><category>polymorphism</category><category>programming languages</category><category>punishment</category><category>rant</category><category>reflection</category><category>review</category><category>scrum politics</category><category>social 
responsibility</category><category>structs</category><category>telecommuting</category><category>the longest journey</category><category>tpm</category><category>trusted computing</category><category>yahoo</category><category>yield</category><title>Beard&#39;s Eye</title><description>Thoughts on software development, life, universe and everything else.</description><link>http://beardseye.blogspot.com/</link><managingEditor>noreply@blogger.com (Anonymous)</managingEditor><generator>Blogger</generator><openSearch:totalResults>26</openSearch:totalResults><openSearch:startIndex>1</openSearch:startIndex><openSearch:itemsPerPage>25</openSearch:itemsPerPage><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-6507802784512764992</guid><pubDate>Wed, 12 Jun 2013 20:16:00 +0000</pubDate><atom:updated>2013-06-12T16:16:35.347-04:00</atom:updated><title>The Code That Needs To Be Written</title><description>I have many good reasons for not wanting to be a manager, but they all boil down to this: I enjoy writing code.&lt;br /&gt;
&lt;br /&gt;
I recently read a &lt;a href=&quot;http://improbabletruths.com/the-code-you-dont-write&quot;&gt;blog post about not writing code&lt;/a&gt;. If you haven&#39;t read it, I encourage you to do so. It&#39;s insightful and right on all counts. It&#39;s also biased, as you can see from the following quote:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
My job isn&#39;t to write code. It never was. It has always been to make people&#39;s lives better and more productive. &lt;/blockquote&gt;
&lt;br /&gt;
Years and years of experience have taught me to avoid writing code when I can and write as little of it as possible. Every line of code you write has its cost. Some of that cost is the time it took you to write it. Some of it is the time it took you and, hopefully, at least one more person to test it. And then there&#39;s the time it will take you or someone else to maintain it. All in all, writing code is not something to dive into without careful thought.&lt;br /&gt;
&lt;br /&gt;
As a coder, every line of code you don&#39;t write spares you and your team time. The question is, how do you spend that &quot;extra&quot; time? You could deliver the product earlier! Just kidding, it&#39;s more likely you&#39;ll spend that time writing other, more important code.&lt;br /&gt;
&lt;br /&gt;
All joking aside, the above is a very serious, important point: don&#39;t avoid code just for the sake of avoiding code. Write the code that needs to be written. And there&#39;s always &lt;i&gt;some&lt;/i&gt; code that needs to be written. While it&#39;s theoretically possible to build a complete, non-trivial software solution without writing a single line of code, I&#39;ve yet to see that actually happen. At the end of the day, when all the management decisions have been made, someone will probably have to write some code.&lt;br /&gt;
&lt;br /&gt;
There are many ways of making people&#39;s lives better and more productive. Writing code might not be my &lt;i&gt;business&lt;/i&gt;, but it is my &lt;i&gt;calling&lt;/i&gt;.</description><link>http://beardseye.blogspot.com/2013/06/the-code-that-needs-to-be-written.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-2128250692838018293</guid><pubDate>Thu, 21 Mar 2013 14:03:00 +0000</pubDate><atom:updated>2013-03-21T19:30:08.707-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">empathy</category><category domain="http://www.blogger.com/atom/ns#">punishment</category><category domain="http://www.blogger.com/atom/ns#">PyCon</category><category domain="http://www.blogger.com/atom/ns#">society</category><title>Off With Their Heads: The PyCon Incident and Our Society</title><description>&lt;i&gt;Fair warning: If you&#39;ve come here looking for a quick validation of your arguments for or against Adria Richards, you can go on elsewhere. I&#39;m not interested in &quot;winning a debate&quot; here. Also, I&#39;m a white male, so if you think that makes me unfit to have any opinion on this topic, I also invite you to move on. Still here? Good, let&#39;s have a civilized conversation.&lt;/i&gt;&lt;br /&gt;
&lt;br /&gt;
If you&#39;re part of the software development industry and have looked at the social media over the past two days, you&#39;ll have heard of Adria Richards and what happened at PyCon. Just in case you haven&#39;t, though, here&#39;s the short version: two guys working for some of the sponsors were sitting within earshot of Adria and making crass jokes involving &quot;forking the repos&quot; and &quot;big dongles&quot;. I invite you to read &lt;a href=&quot;http://butyoureagirl.com/14015/forking-and-dongle-jokes-dont-belong-at-tech-conferences/&quot;&gt;Adria&#39;s blog post&lt;/a&gt; about it before going on.&lt;br /&gt;
&lt;br /&gt;
Right off the bat, let me tell you that I&#39;m not writing this post to condemn either these guys or Adria. As I implied at the top, I don&#39;t see things in black and white here, and I think that too many people currently participating in the conversation -- including Adria -- are looking at this as a battle to win. That bothers me a lot, and I think something needs to be done about it. Before we get to that part, though, we need to discuss the incident a bit.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
What Happened?&lt;/h3&gt;
&lt;br /&gt;
The first I heard of this incident was this very morning, when I looked at the Hacker News front page and noticed a story that had over 900 points and more than 800 comments. It was a link to a Pastebin text that paints what turned out to be &lt;a href=&quot;http://pastebin.com/JaNh0w5F&quot;&gt;a rather one-sided picture&lt;/a&gt;. My first reaction was outrage: just one paragraph into the text, my imagination conjured a vivid scene of two guys talking in a hallway and someone sneaking up on them to take a photo. I instantly developed a strong dislike for Adria and had to fight that dislike the whole time I was looking for more information. There&#39;s still a part of me that &lt;i&gt;wants&lt;/i&gt; to hate Adria, and I say that to illustrate the importance of not acting on your first impression.&lt;br /&gt;
&lt;br /&gt;
Here&#39;s the thing you need to keep first and foremost in your mind whenever you stumble upon a situation like this: &lt;b&gt;you don&#39;t know enough&lt;/b&gt;. I don&#39;t know what the words of the joke were. I don&#39;t know whether their joke had anything to do with what was being discussed on stage. I don&#39;t know if the guy that got fired had a stellar record within his company or was considered problematic. Those are just some examples of things the vast majority of us don&#39;t know even now.&lt;br /&gt;
&lt;br /&gt;
So let&#39;s start by sticking to what we do know:&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;If you&#39;re at a convention, you represent your company and your actions can reflect on it.&lt;/li&gt;
&lt;li&gt;If you&#39;re sitting together with a bunch of people arranged in rows of seats, your expectation of privacy drops rapidly.&lt;/li&gt;
&lt;li&gt;It is not acceptable to ruin someone else&#39;s experience for them with your conversation. It&#39;s like talking in a movie theater.&lt;/li&gt;
&lt;/ul&gt;
All in all, it was definitely not okay for the guys in question to do what they did and neither Adria nor anyone in her position should be expected to tolerate it. For the record, I also agree that just talking to the guys directly would most likely have been ineffective. I&#39;ve seen people deflect the complaint too often to keep believing that this approach works.&lt;br /&gt;
&lt;br /&gt;
That said, things turned ugly on more than one level and this is what I want to discuss here.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
Off With Their Heads!&lt;/h3&gt;
&lt;br /&gt;
If you look at Adria&#39;s Twitter stream and at some of the blog posts around, you&#39;ll notice something interesting: people have taken sides and are arguing against the other side. It&#39;s like we&#39;ve dug two huge trenches and people in each are now taking potshots at the other trench.&lt;br /&gt;
&lt;br /&gt;
Let&#39;s take a step back and look at the outcome of the incident, without foaming at the mouth. One good thing is that the PyCon Code of Conduct got upheld and it&#39;s been made clear once again that those aren&#39;t just words. Another good thing is that unacceptable behavior got punished.&lt;br /&gt;
&lt;br /&gt;
On the other hand, one undeniably bad thing that happened is that a person lost a job. I say &quot;a person&quot;, because it should not matter whether it was a man or a woman or what their skin color is. I agree wholeheartedly with &lt;a href=&quot;https://plus.google.com/104757475552569715504/posts/N81SaYUT5xG&quot;&gt;Avdi Grimm&lt;/a&gt; when he says:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
We need space between &quot;you&#39;re fine&quot; and &quot;you&#39;re fired&quot;.&lt;/blockquote&gt;
Did the guy from PyCon deserve to be fired for his jokes? It depends on a number of factors: whether his joke was downright sexist or just immature, whether he was fired just for that or had other problems at work, and so on. If we assume -- as most people seem to -- that his jokes were merely immature and that he was fired only for that, then I would argue that such a drastic disciplinary measure was out of proportion.&lt;br /&gt;
&lt;br /&gt;
Another bad thing that happened is that a huge controversy has been made out of this and the rift between men and women in tech -- a ridiculous rift that shouldn&#39;t exist in the first place -- has likely been made wider.&lt;br /&gt;
&lt;br /&gt;
Nearly everyone is out for blood. Some people are viciously reveling in the fact that one of the guys lost his job, while others are furiously demanding an apology from Adria and SendGrid.&lt;br /&gt;
&lt;br /&gt;
It&#39;s like we fell down the rabbit hole and everyone is either the Queen of Hearts, screaming &quot;off with their heads&quot;, or a part of her army of card soldiers.&lt;br /&gt;
&lt;br /&gt;
It&#39;s too easy for things to spiral out of control and out of proportion and I find it worrying that, in our society, you can cause a social media avalanche with one tweet and set loose a mob on someone. Regardless of whether you feel righteous about it, you should step back and consider all the possible consequences before committing to an extreme action.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
Punishment, Fear and Empathy&lt;/h3&gt;
&lt;br /&gt;
&lt;i&gt;&lt;b&gt;&lt;span style=&quot;color: #e06666;&quot;&gt;Trigger Warning:&lt;/span&gt;&lt;/b&gt; this section makes references to a high-profile rape case, so you might want to skip to the notice that says &quot;End Trigger Warning&quot;.&lt;/i&gt;&lt;br /&gt;
&lt;br /&gt;
Events at PyCon are really just a reflection of our society&#39;s attitude in general. One recent example that illustrates this attitude is the Steubenville rape case. In case you&#39;re not familiar with it, go Google it, because I&#39;m not going to describe any details here. What I do want to focus on is the way certain media comments were handled.&lt;br /&gt;
&lt;br /&gt;
CNN reporter Poppy Harlow made the following comment about the case:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
It was incredibly emotional, incredibly difficult even for an outsider like me to watch what happened as these two young men that had such promising futures -- star football players, very good students -- we literally watched as, they believe, their life fell apart.&lt;/blockquote&gt;
The public reaction was one of outrage. Again, my first reaction was similar. However, I later stumbled upon an honest and humble questioning of this reaction by a writer I admire greatly, and that made me stop and try to think more critically.&lt;br /&gt;
&lt;br /&gt;
&lt;b&gt;There is no question whatsoever that these two youths deserve their punishment.&lt;/b&gt; I believe there&#39;s no defense or justification of their acts. That doesn&#39;t mean that the situation isn&#39;t tragic, not only for the rape victim, but also for them. These are two people that could have had a much better future, if only we, as a society, could have detected the flaws in their upbringing early and corrected them. It really is tragic to see all three lives damaged so profoundly. That&#39;s why we call certain things tragedies: the net outcome is negative.&lt;br /&gt;
&lt;br /&gt;
Yet society at large seems to value only the punishment, and no one is allowed a modicum of human compassion and empathy for the bigger picture.&lt;br /&gt;
&lt;br /&gt;
&lt;b&gt;&lt;span style=&quot;color: #e06666;&quot;&gt;&lt;i&gt;End Trigger Warning&lt;/i&gt;&lt;/span&gt;&lt;/b&gt;&lt;br /&gt;
&lt;br /&gt;
The problem with the PyCon incident and with our society in general is that we care disproportionately about punishment. It&#39;s certainly &lt;i&gt;necessary&lt;/i&gt; to punish inappropriate behavior, but is it &lt;i&gt;enough&lt;/i&gt;?&lt;br /&gt;
&lt;br /&gt;
As the father of a six-year-old kid, I can emphatically claim that it isn&#39;t. Believe me, it&#39;s a mistake I made earlier, and the result was that my son was afraid of me. When he said so, my heart nearly broke. I&#39;ve been careful ever since not to repeat that mistake.&lt;br /&gt;
&lt;br /&gt;
Another thing I had to learn was that not every offense deserves the same punishment. If your kid peed his pants because he didn&#39;t want to pause the damn video game, it&#39;s not the same thing as throwing a temper tantrum because you told him he couldn&#39;t play before doing his homework.&lt;br /&gt;
&lt;br /&gt;
You cannot solve a behavioral problem by simply punishing the bad behavior and stopping there. You need to look at the bigger picture and you need &lt;i&gt;empathy&lt;/i&gt;. Fear is a poor problem-solving tool.&lt;br /&gt;
&lt;br /&gt;
I applaud Adria Richards for not tolerating unacceptable behavior and standing up to it, instead. I also disagree with her methods and believe there were better ways of dealing with things.&lt;br /&gt;
&lt;br /&gt;
Most of all, I strongly feel that, instead of glorifying the outcome of the PyCon incident, we should keep looking for ways to change our culture, so that history doesn&#39;t repeat itself.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;i&gt;UPDATE #1 (2013-03-21, 15:02 &lt;a href=&quot;http://www.timeanddate.com/library/abbreviations/timezones/sa/clst.html&quot;&gt;CLST&lt;/a&gt;): &lt;/i&gt;Continuing the trend of punishment-oriented overreaction, it appears that &lt;a href=&quot;http://blog.sendgrid.com/sendgrid-statement/&quot;&gt;SendGrid has decided to fire Adria Richards&lt;/a&gt;. I am still hoping that they were simply hacked and that this is false information spread by the misogynist mob of script kiddies, but that hope is fading fast. I&#39;ve never been more depressed about people illustrating my point for me.&lt;br /&gt;
&lt;br /&gt;
&lt;i&gt;UPDATE #2 (2013-03-21, 19:29 &lt;a href=&quot;http://www.timeanddate.com/library/abbreviations/timezones/sa/clst.html&quot;&gt;CLST&lt;/a&gt;):&lt;/i&gt; SendGrid has just posted an &lt;a href=&quot;http://blog.sendgrid.com/a-difficult-situation/&quot;&gt;official explanation&lt;/a&gt; of their decision. While I still disagree with two people losing their jobs over this whole situation, I wholeheartedly agree with SendGrid&#39;s statement that Adria&#39;s &quot;actions have strongly divided the same community she was supposed to unite&quot;.</description><link>http://beardseye.blogspot.com/2013/03/off-with-their-heads-pycon-incident-and.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-4189056639569592594</guid><pubDate>Sun, 17 Mar 2013 15:24:00 +0000</pubDate><atom:updated>2013-03-17T13:32:20.328-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">social responsibility</category><category domain="http://www.blogger.com/atom/ns#">society</category><category domain="http://www.blogger.com/atom/ns#">telecommuting</category><category domain="http://www.blogger.com/atom/ns#">yahoo</category><title>Mrs. Mayer&#39;s Major Misstep</title><description>I can&#39;t help feeling that a lot of us have had some rather wild expectations from the 21st century. We watched &quot;Back to the Future&quot; in our teenage years and fantasized about flying garbage-powered cars. We read Asimov and fantasized about having robots throw out our trash while we colonize other planets. We watched Star Trek and the idea of a post-scarcity society filled us with hopes and dreams.&lt;br /&gt;
&lt;br /&gt;
The reality has yet to catch up with science fiction, but we did get to see quite a few technological advances that were nothing short of amazing. Unfortunately, it seems we expected to see the society change for the better at an equal pace as technology and that&#39;s why we keep getting disappointed on a regular basis.&lt;br /&gt;
&lt;br /&gt;
Take telecommuting, for example. The very idea that you can be in a large team doing a complicated job and contribute your part from home would have been considered fantastic when I was a kid. Indeed, it &lt;i&gt;was&lt;/i&gt; a fantasy back then: Asimov&#39;s novel &quot;The Naked Sun&quot; describes a society where almost every social interaction is done via holographic telepresence. Since then, the availability of telecommuting as an option has improved many lives and quite often resulted in a net gain for everyone involved.&lt;br /&gt;
&lt;br /&gt;
Then came Marissa Mayer&#39;s decision to ban telecommuting at Yahoo. Unsurprisingly, the initial reactions across the web were largely those of outraged criticism. A furious debate sprang up between those who condemn her move as oppressive and those who see in it a logical necessity for Yahoo as a company to catch up with the competition. More than two weeks later, the discussion has slowed only fractionally, but it&#39;s far from stopping, as new articles and posts keep popping up all over.&lt;br /&gt;
&lt;br /&gt;
Although much has been written about benefits and drawbacks of working from home and about whether Marissa&#39;s strategy is good or bad for Yahoo, that&#39;s not really the crux of this debate. Much like the assassination of the Archduke Franz Ferdinand was just a pretext for World War I, but not the real reason for it, the outrage about Mrs. Mayer&#39;s decision stems from its potential effects on the society at large.&lt;br /&gt;
&lt;br /&gt;
It&#39;s amazing how many people still get this wrong. For example, Rebecca Greenfield of The Atlantic Wire &lt;a href=&quot;http://www.theatlanticwire.com/technology/2013/02/chill-out-marissa-mayer-work-home-memos-not-about-you/62548/&quot;&gt;writes that we should chill out&lt;/a&gt;, because Marissa Mayer&#39;s memo is not about us:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
But when Mayer declares an end to working from home, it does not in fact put an end to working from home elsewhere. Which is a relief because, if we&#39;ve learned anything, some people really don&#39;t want to work in an office all the time.&lt;/blockquote&gt;
It sounds reasonable on the face of it, but it&#39;s either downright disingenuous or just misguided optimism, depending on how generous you feel when you read it.&lt;br /&gt;
&lt;br /&gt;
If you really think Marissa Mayer&#39;s decision is only about Yahoo, then you&#39;re living a sheltered life. Regardless of its position in Silicon Valley&#39;s pecking order, Yahoo is still a large, prominent company and, as such, it can and does influence the society at large. Right off the bat, there were places where Mayer&#39;s move was taken as a sign that working from home should be banned.&lt;br /&gt;
&lt;br /&gt;
Take, for example, &lt;a href=&quot;http://articles.economictimes.indiatimes.com/2013-02-27/news/37310223_1_indian-homes-indian-companies-productivity&quot;&gt;this article&lt;/a&gt; from The Economic Times. Here&#39;s a choice quote:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
&quot;Flexitime is a utopian concept that is not going to help anyone,&quot; says K Ramkumar, Executive Director, ICICI Bank. &quot;Whatever is not natural to the market and commerce, will not work. Customer is the king.&quot;&lt;/blockquote&gt;
You might doubt the quality of the source, but it does not matter whether Mr. Ramkumar really said that or not. What matters is that there are numerous companies and managers who think like that and who must have been elated by the implicit validation they received from Mrs. Mayer&#39;s decision.&lt;br /&gt;
&lt;br /&gt;
Here in Chile, the prevalent management style has been given a colorful name: &quot;The Ranch Owner&quot;. What that means, in short, is that a lot of managers here expect their commands to be taken as divine law. For them, the topic of employees&#39; happiness can be summed up with &quot;I pay you to work. Whether you&#39;re happy or not is your problem.&quot;&lt;br /&gt;
&lt;br /&gt;
What I&#39;m driving at is that high-profile decisions in a company like Yahoo have an effect on much more than the company itself. The ripple effects of those decisions can affect a large part of the world.&lt;br /&gt;
&lt;br /&gt;
To mitigate the impact of her decision, all Marissa Mayer had to do was make her telecommuting policy a bit more moderate: allow working from home on a case-by-case basis and, perhaps, choose better wording that acknowledges telecommuting as an option that can enhance the quality of people&#39;s lives in general.&lt;br /&gt;
&lt;br /&gt;
Instead, she&#39;s being defended and even praised for making a tough, controversial choice for the good of her company. The effects of her decision are being swept under the carpet of &quot;fostering innovation&quot;, by people like Rebecca Cooper in &lt;a href=&quot;http://www.bizjournals.com/washington/blog/techflash/2013/03/marissa-mayer-just-might-be-right.html?page=all&quot;&gt;her article&lt;/a&gt; for Washington Business Journal:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
If she thinks bringing all her workers together in one place to innovate is the way to rebuild the brand, then I think we should let her try. I&#39;m all for companies trusting their employees and giving them the ability to perform anytime and anywhere they can flourish. But I also think we shouldn&#39;t react so quickly to condemn this CEO for trying some old-fashioned togetherness to reboot Yahoo.&lt;/blockquote&gt;
Even more chilling are the stories that praise her for &lt;a href=&quot;http://www.businessinsider.com/how-marissa-mayer-figured-out-work-at-home-yahoos-were-slacking-off-2013-3&quot;&gt;&quot;basing her decision on data&quot;&lt;/a&gt;. There&#39;s nothing wrong with consulting VPN logs and finding out that a lot of people are abusing their telecommuting privileges. Getting from there to outright banning telecommuting with no case-by-case considerations has nothing to do with data. Promoting the idea that you can make any decision infallible given enough data is downright irresponsible, especially when supported by flawed arguments such as this one:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
Once, a Google designer quit the company in a huff because he was tired of how Mayer, in charge of how Google.com homepage looked, would choose design elements like color or font not based on taste, but raw data.&lt;br /&gt;
&lt;br /&gt;
For every design variable, she looked at how users interacted with Google with one design — and then the other.&lt;br /&gt;
&lt;br /&gt;
If the data showed users were using Google.com faster one way instead of the other, that particular design choice won out.&lt;br /&gt;
&lt;br /&gt;
It&#39;s hard to argue that Mayer&#39;s process didn&#39;t work for Google. It was not the first search engine on the market, but it&#39;s just about the only one anybody uses now.&lt;/blockquote&gt;
Conflating Google&#39;s success as a search engine with Mayer&#39;s decision making process about &lt;i&gt;Google home page design&lt;/i&gt; is naive, but it takes a whole new level of self-delusion to use that as an argument to blow Mayer&#39;s analysis of VPN logs out of proportion and defend her decision as &quot;data-driven&quot;.&lt;br /&gt;
&lt;br /&gt;
Big companies nowadays parade their social responsibility initiatives and milk them for all they&#39;re worth. Yet when it comes to caring about how your words might influence the lives of workers across the planet, many people seem to think it&#39;s not such a big deal.&lt;br /&gt;
&lt;br /&gt;
After all, why should we care about the power of words? &quot;Sticks and stones will break my bones, but words will never harm me.&quot; If that&#39;s what you subscribe to, then &lt;a href=&quot;http://www.youtube.com/watch?v=ltun92DfnPY&quot;&gt;maybe you should reconsider&lt;/a&gt;.</description><link>http://beardseye.blogspot.com/2013/03/mrs-mayers-major-misstep.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-3480079595853910807</guid><pubDate>Mon, 04 Mar 2013 14:27:00 +0000</pubDate><atom:updated>2013-03-04T19:21:32.667-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">discussion</category><category domain="http://www.blogger.com/atom/ns#">education</category><category domain="http://www.blogger.com/atom/ns#">hype</category><title>The Great Programming Debate</title><description>Over the last few years, I&#39;ve noticed a pattern that keeps repeating in Internet forums and blogs:&lt;br /&gt;
&lt;ol&gt;
&lt;li&gt;Someone
 asserts that programming is or should be for everybody, usually 
misrepresenting it as a glamorous, easy job that&#39;s all benefits and no drawbacks, like &lt;a href=&quot;http://www.youtube.com/watch?v=nKIu9yen5nc&quot;&gt;this&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Someone else posts an emphatic counterpoint, like &lt;a href=&quot;http://symbo1ics.com/blog/?p=1615&quot;&gt;this&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;A
 third person tries to point out that a certain degree of &quot;programming 
literacy&quot; should be required and goes on to point out that everyone 
knows (or should know) how to read and write, like &lt;a href=&quot;http://news.ycombinator.com/item?id=5316088&quot;&gt;this&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;The
 discussion devolves into polite bickering, because at this point most 
people are usually past their threshold for completely unbiased 
discussion and are invested in their own point of view.&lt;/li&gt;
&lt;/ol&gt;
I 
happen to have my own point of view, but it usually gets lost in the 
noise, since I invariably try to contribute it in a discussion during 
step 4. Looking back, I can see I&#39;ve weighed in on both sides of the 
debate and I think it might be interesting to try to share a more 
balanced view and, perhaps, bring the people from the two extremes a bit
 closer.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
Why Programming Should Be For Everybody&lt;/h3&gt;
&lt;br /&gt;
As I pointed out recently, in a similar discussion, I got into
 computers when I was 7 years old. Back then, &quot;getting into computers&quot;, 
at least in my country, meant getting something like ZX Spectrum or 
Commodore and either playing games or learning how to code. Games were 
in English and a lot uglier than their arcade counterparts, so it should
 come as no surprise that most kids who had computers went on to learn 
how to make them &quot;do tricks&quot;. That&#39;s the real allure of programming for a
 kid: you&#39;re typing in seemingly incomprehensible stuff and the machine 
does something. It&#39;s magic!&lt;br /&gt;
&lt;br /&gt;
Not everyone had a 
computer, mainly because we were living in Communist Yugoslavia and it 
was not easy to import one, but nobody told us that computers are &quot;not 
for everyone&quot; and that only trained professionals can program. It was a 
slightly weird hobby to have, but it grew into some of the best years of
 my life: I grew up believing that the world would be transformed and 
that I would be part of it. Some days I felt like a combination of a 
wizard, secret agent and super-hero; other times I felt like a rebel 
straight out of Mentor&#39;s &lt;a href=&quot;http://www.phrack.org/issues.html?issue=7&amp;amp;id=3&quot;&gt;Hacker Manifesto&lt;/a&gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;u&gt;At no point did I feel like a kid who&#39;s playing at being adult and should leave it to professionals.&lt;/u&gt;&lt;br /&gt;
&lt;br /&gt;
You might think I&#39;m attacking a straw man here, but this sort of elitism seems to be increasingly prominent in discussions nowadays:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
&lt;i&gt;Sorry, but those things are technology and should be worked on 
by technologists because there&#39;s literally no on else who can work on 
them&lt;/i&gt;&lt;/blockquote&gt;
That choice quote came from a &lt;a href=&quot;http://news.ycombinator.com/item?id=5292820&quot;&gt;discussion&lt;/a&gt; on Hacker News and it&#39;s the 
kind of extreme point of view that provokes people to react with very 
emphatic counterpoints.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
Why Programming Isn&#39;t For Everybody &lt;br /&gt;
&lt;/h3&gt;
&lt;br /&gt;
The problem with emphatic
 counterpoints is that they tend to go to the opposite, equally 
unhealthy extreme. If you&#39;ve watched &quot;What Most Schools Don&#39;t Teach&quot;, 
then you know that the intentions and goals behind the video are 
laudable. You probably also noticed that programming has been grossly 
misrepresented.&lt;br /&gt;
&lt;br /&gt;
If
 you didn&#39;t know better, if you were the stereotypical impressionable 
youth, you might have walked away from that video believing that 
programming is easy if you know addition, subtraction and multiplication
 tables and that working as a programmer is a lot of fun, where you get 
free gourmet food, play video games and jam with a band inside an 
awesome office. You might also get the impression that you would be 
programming flying robots, self-driving cars and medical equipment.&lt;br /&gt;
&lt;br /&gt;
As much
 as I agree that programming should be taught in schools -- and I&#39;ll get
 back to that later -- the fact is that programming is not easy and &lt;u&gt;good programming is damn hard&lt;/u&gt;. To make things harder, good programming is a lot more important today, precisely because computers are so widespread. Bad code does have the power to ruin lives. I&#39;m not talking about medical equipment or space shuttle systems here: a bad credit report won&#39;t kill you, but it might ruin your life for years; a badly programmed system might incorrectly flag you as a terrorist.&lt;br /&gt;
&lt;br /&gt;
When conveying a positive message in service of a good cause, you must take care not to go to extremes. Nobody tells kids that being a doctor is as easy as putting on a stethoscope and learning which side of the body holds the heart and which the liver. By simultaneously downplaying the difficulties of programming too much and painting a glamorous picture of it, you&#39;re setting kids up for disappointment:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
&lt;i&gt;We’ve all been raised on television to believe that one day we’d all be millionaires, and movie gods, and rock stars. But we won’t. And we’re slowly learning that fact. And we’re very, very pissed off.&lt;/i&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
-- Chuck Palahniuk, &quot;Fight Club&quot;&lt;/div&gt;
&lt;/blockquote&gt;
&lt;br /&gt;
Incidentally, you&#39;re also bruising the egos of everyone who has worked their ass off for more than ten years trying to get closer to being a master programmer. While that&#39;s the most forgivable mistake, it still merits being mentioned, because you&#39;re alienating the very people you need the most.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
Shades of Grey&lt;/h3&gt;
&lt;br /&gt;
In the end, it all boils down to recognizing that things aren&#39;t black and white. Putting programming on a pedestal, unreachable by all but a select few, is as wrong as representing it as a glamorous job where anyone can be a rock star if they only try.&lt;br /&gt;
&lt;br /&gt;
I agree that all schools should teach a certain level of programming, just as they teach literacy and a certain level of mathematics. In our society, computers are everywhere and everyone should be brought up to understand them and be reasonably proficient at manipulating them.&lt;br /&gt;
&lt;br /&gt;
On the other hand, even though everyone is taught reading and writing, not everyone ends up being a writer. Even those who do are not necessarily successful. Likewise:&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;Knowing basic biology does not make you a doctor.&lt;/li&gt;
&lt;li&gt;Learning how to play an instrument well enough to play it professionally requires constant practice and even then you might never get to be famous and hugely successful.&lt;/li&gt;
&lt;li&gt;Getting into the NBA requires more than being able to run the length of the basketball court.&lt;/li&gt;
&lt;li&gt;Being able to prepare scrambled eggs or even come up with a recipe or two of your own doesn&#39;t make you Jamie Oliver. &lt;/li&gt;
&lt;/ul&gt;
Are these analogies exaggerated? Yes, but so are some messages we offer about programming. Let&#39;s face it, not everyone will have the aptitude to make programming more than a hobby.&lt;br /&gt;
&lt;br /&gt;
The important thing is that people of all ages need to be free to try it and we need to welcome them. However, it&#39;s equally important to make sure it&#39;s okay for them to fail. Our industry needs all the good hands it can get, but it&#39;s hard enough as it is without adding tons of bad, unhappy programmers who were promised the world and tricked into believing they could get it by wishing upon a star.&lt;br /&gt;
&lt;br /&gt;</description><link>http://beardseye.blogspot.com/2013/03/the-great-programming-debate.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-4686465799462181278</guid><pubDate>Fri, 22 Feb 2013 02:10:00 +0000</pubDate><atom:updated>2013-02-21T23:10:35.148-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">dreamfall</category><category domain="http://www.blogger.com/atom/ns#">dreamfall chapters</category><category domain="http://www.blogger.com/atom/ns#">games</category><category domain="http://www.blogger.com/atom/ns#">the longest journey</category><title>Another Step in The Longest Journey</title><description>Over the last two weeks, I&#39;ve been tweeting quite often about a certain project on Kickstarter. It&#39;s a game called &lt;a href=&quot;http://www.kickstarter.com/projects/redthread/dreamfall-chapters-the-longest-journey&quot;&gt;&quot;Dreamfall Chapters: The Longest Journey&quot;&lt;/a&gt; and, if it doesn&#39;t ring a bell at all, I want to tell you why I&#39;m so excited about it.&lt;br /&gt;&lt;br /&gt;I&#39;ve always loved video games, for the same reason I love -- and practically devour -- books: they tell me magical stories, take me to incredible places and allow me to do impossible things. The first game I ever played was &quot;Chequered Flag&quot; on ZX Spectrum 48 that my parents gave me when I was 7 years old. Since then, I&#39;ve watched the history of video games unfold and it has been a fascinating journey. &lt;br /&gt;&lt;br /&gt;Right now, the gaming industry is in a state that I find pretty disappointing: there&#39;s a bunch of big players that are mostly staying within the bounds of their franchises and churning out sequels following their tried, true and increasingly tired formulas. My prime example is Ubisoft. 
They won my heart with &quot;Prince of Persia: The Sands of Time&quot; and &quot;Assassin&#39;s Creed&quot;. Then they broke it with &quot;Assassin&#39;s Creed III&quot;. To be precise, they didn&#39;t break it all at once: they chipped away at it until it broke. It was a blow to me when they abandoned the beautiful &quot;Prince of Persia&quot; reboot in order to squeeze out &quot;The Forgotten Sands&quot; like another bowel movement, but what really eroded my respect for them was the way they kept stretching Desmond&#39;s story in the &quot;Assassin&#39;s Creed&quot; franchise until it felt like I stepped in a piece of bubblegum. Another example is Bethesda, although this one is much more positive, because they haven&#39;t (yet) pissed all over their players. The worst thing they&#39;ve done is the progressive dumbing-down of &quot;The Elder Scrolls&quot;.&lt;br /&gt;&lt;br /&gt;Truly different games are rare and precious and if I had to single out a few, I would certainly start with &quot;Shadow of the Colossus&quot; and &quot;Heavy Rain&quot;. Both throw away most of the conventions held dear by other games and come up with something unique. The former tells a pretty minimalistic story in a profoundly haunting way that leaves you in a pensive mood. The latter combines the best of reading books and playing video games: it&#39;s incredibly rich and immersive like a good book, but interactive and with a flexible story like a good game.&lt;br /&gt;&lt;br /&gt;On the bright side, extreme innovation is not required to get a great game. Sometimes a brilliant execution of old gaming formulas will do the trick, such as Rockstar&#39;s &quot;Red Dead Redemption&quot; and &quot;L.A. Noire&quot;. What makes these games exceptional is the way they explore profound themes through rich stories wrapped in engrossing gameplay.&lt;br /&gt;&lt;br /&gt;This is why I&#39;m so enthusiastic about &quot;Dreamfall Chapters&quot;. 
For those of you who don&#39;t know the history behind it, it&#39;s the third and final part in the saga that started in 1999 with &quot;The Longest Journey&quot;, a point-and-click adventure. The second part came out a whopping seven years later in the form of a third-person 3D adventure called &quot;Dreamfall&quot;. Both games feature not only exquisitely rich and memorable world-building, but also excellent storytelling, with memorable characters and thought-provoking motifs, and beautiful graphics. And now, in 2013, Ragnar Tørnquist and his team at Red Thread Games are working on the final leg of a truly epic journey, wholly deserving of being called the longest: &quot;Dreamfall Chapters&quot;.&lt;br /&gt;&lt;br /&gt;Now, let&#39;s step back a bit here and focus on one important fact: how much passion, patience and balls does it take to stick to a dream over the course of 14 years? For that alone, I would have been inclined to back &quot;Dreamfall Chapters&quot;, even if I hadn&#39;t played or heard of its prequels.&lt;br /&gt;&lt;br /&gt;That, however, is not the primary reason why I&#39;m trying to do all I can to promote &quot;Dreamfall Chapters&quot;. The real reason is that I know Ragnar and his team are going to produce another masterpiece that will not only bring enjoyment to countless players across the world, but will also bring a breath of fresh air to the gaming industry.&lt;br /&gt;&lt;br /&gt;Here&#39;s why I trust the Red Thread Games team: I first played &quot;Dreamfall&quot;, without even having heard of &quot;The Longest Journey&quot;. I played the whole game and enjoyed it immensely, without having any idea that it was a sequel. I never felt lost. There was never a &quot;something doesn&#39;t add up&quot; moment or &quot;am I missing something here&quot; feeling. And yet, when I discovered &quot;The Longest Journey&quot; and played it, not only did it fit perfectly, but it transformed &quot;Dreamfall&quot; for me. 
So far, I&#39;ve only had one other experience like that: when reading Discworld books by Sir Terry Pratchett. If you&#39;ve read Pratchett&#39;s work, then you know how high praise that is.&lt;br /&gt;&lt;br /&gt;By all means, don&#39;t misunderstand me: I don&#39;t claim these two games were perfect. &quot;The Longest Journey&quot;, for example, had one of those &quot;dude, wtf?&quot; puzzles that old-style adventure games were so criticized for; just look up &quot;the longest journey duck puzzle&quot; on Google. &quot;Dreamfall&quot; had a totally superfluous combat system that makes a game of &quot;Pong&quot; feel like &quot;Ninja Gaiden&quot; in comparison. Yet, despite these imperfections, these games were -- and remain -- some of the most enjoyable adventure games out there.&lt;br /&gt;&lt;br /&gt;So far I&#39;ve covered why I trust the team, but I haven&#39;t really gone into much detail about why I believe this to be so important. So it&#39;s another adventure game, albeit an excellent one, so what?&lt;br /&gt;&lt;br /&gt;I&#39;m very passionate about games. Apart from enjoying them as a player, I&#39;m also one of those few adult programmers who hasn&#39;t outgrown the dream of making them. For me, it&#39;s important not only what games are made, but who makes them and how.&lt;br /&gt;&lt;br /&gt;Ragnar Tørnquist left Funcom and founded Red Thread Games in order to make &quot;Dreamfall Chapters&quot;. Don&#39;t get me wrong, I have nothing against Funcom at all, but the fact remains that a well-known game designer and one of the iconic figures of the game development industry left an established, stable studio and went indie to make this game. And it&#39;s not just any game, it&#39;s the long-anticipated finale to a saga that has, despite any imperfections we might find in it, earned its place in the history of video games.&lt;br /&gt;&lt;br /&gt;I want to see it succeed and I want it to be a huge success. 
Apart from being a work of art, it will also challenge the status quo by showing that it really is possible to make a masterpiece without clinging to the old publisher-developer structure. And, as a bonus, it will contribute to Linux and Mac gaming scene and bring back some of the glory to PC gaming, wresting a bit of power from the tightly controlled walled gardens of the console world. All in all, it should be an important step in a journey even longer than &quot;The Longest Journey&quot;: the journey through the history of gaming.&lt;br /&gt;&lt;br /&gt;If you&#39;re still with me after all these words, then I hope I&#39;ve convinced you to at least go to the &lt;a href=&quot;http://www.kickstarter.com/projects/redthread/dreamfall-chapters-the-longest-journey&quot;&gt;Kickstarter page for &quot;Dreamfall Chapters&quot;&lt;/a&gt; and look at what you find there. And after that, if you feel moved to back the project, then so much the better.&lt;br /&gt;</description><link>http://beardseye.blogspot.com/2013/02/another-step-in-longest-journey.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>1</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5107602209681073993</guid><pubDate>Sun, 18 Nov 2012 02:14:00 +0000</pubDate><atom:updated>2012-11-17T23:14:23.695-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">bad habits</category><category domain="http://www.blogger.com/atom/ns#">bug reporting</category><title>Fix my car!</title><description>Late Friday afternoon stretched slowly towards the evening. The A/C inside Mike’s auto repair workshop had been fighting a losing battle with the midsummer sun all day. To anyone coming in, the noise and the bustle were disorienting, and the smell of grease, oil and exhaust fumes was nearly overpowering. To Mike himself, it sounded like a hum of a busy day and smelled like the challenge of honest work.&lt;br /&gt;
&lt;br /&gt;
Elbow deep into the hood of a ‘99 Jetta, whistling absentmindedly (and off key), Mike was concentrating on a tricky bit of engine repair, when a tap on his shoulder startled him.&lt;br /&gt;
&lt;br /&gt;
“Excuse me,” came a tentative, almost plaintive voice from behind.&lt;br /&gt;
&lt;br /&gt;
“Be with you in a jiffy,” said Mike, as he began to extricate himself. In just a few seconds, he was free and wiping his hands and arms on a rag as he studied the neatly dressed, bespectacled young man in front of him. “Hi, I’m Mike. How can I help you?”&lt;br /&gt;
&lt;br /&gt;
“Yeah, hi. My girlfriend’s car is broken and I need someone to fix it.”&lt;br /&gt;
&lt;br /&gt;
“Well, then you’ve certainly come to the right place! Bring it in and let’s have a look at it.”&lt;br /&gt;
&lt;br /&gt;
“Uhm, I can’t. It’s not here. It’s parked in my driveway.”&lt;br /&gt;
&lt;br /&gt;
“Ohh-kay.” It took Mike all the self-control he learned in more than a decade of running his shop to keep his face politely impassive. ”What make and model is it?”&lt;br /&gt;
&lt;br /&gt;
“It’s... um, I’m not sure. It’s green.”&lt;br /&gt;
&lt;br /&gt;
“Green. Right.” Silently counting to ten in the back of his mind, Mike tried a different tack. “Right. And what’s wrong with it?”&lt;br /&gt;
&lt;br /&gt;
“I don’t know,” the young man in front of him huffed exasperatedly, “I was hoping you could tell me.”&lt;br /&gt;
&lt;br /&gt;
“What I’m asking,” said Mike, surprised to find he wasn’t gritting his teeth, “is why you say the car is broken? What happened?”&lt;br /&gt;
&lt;br /&gt;
“Oh. That. Sorry. It doesn’t move.”&lt;br /&gt;
&lt;br /&gt;
“Uh huh... Anything, um, more specific?”&lt;br /&gt;
&lt;br /&gt;
“Nope. Don’t know the details.”&lt;br /&gt;
&lt;br /&gt;
“Well then,” shrugged Mike apologetically. “I’m afraid there’s not much I can do to help.”&lt;br /&gt;
&lt;br /&gt;
“Okay, fine. Hang on a second.” With that, the young man whipped out a cell phone and punched a number into it. “Hi, honey. Yeah, I’m at the auto repair shop. Look, I need to know what’s wrong with the car... Yeah, I know, that’s what I said too, but they need more details to do their job... I know, right? … Uh huh. Uh huh... Okay, great. Love you. Buh-bye!”&lt;br /&gt;
&lt;br /&gt;
“Right, so, the problem is that it doesn’t move when she puts it in gear. She steps on the pedal and the car just sits there and roars. The wheels won’t move.”&lt;br /&gt;
&lt;br /&gt;
“Okay,” said Mike, “now we’re getting somewhere. It sounds like a transmission problem.”&lt;br /&gt;
&lt;br /&gt;
“Transmission? What do you mean? What’s it transmitting and to whom?”&lt;br /&gt;
&lt;br /&gt;
“No, no, it’s the mechanism in your car that allows the wheels to move when you put it into gear. That mechanism seems to be the problem.”&lt;br /&gt;
&lt;br /&gt;
“Right, that’s what I just told you. Now, when will you fix it and how much will it cost?”&lt;br /&gt;
&lt;br /&gt;
“Well, when you bring it in, I can look into it and diagnose the exact problem.”&lt;br /&gt;
&lt;br /&gt;
“Didn’t you just say that this transmission thingamajig is the problem?”&lt;br /&gt;
&lt;br /&gt;
“Yes, but the transmission mechanism is a complex system with lots of components. I need to look at it to find out exactly what is broken and how to fix it.”&lt;br /&gt;
&lt;br /&gt;
A suspicious look settled on the young man’s face. “I don’t know,” he said, arms crossed, “It seems to me like you should be able to tell me how to replace this transmission system.”&lt;br /&gt;
&lt;br /&gt;
Mike tried to find something to reply to that and failed. He opened his mouth a few times, but before he could come up with a way to deal with this Kafkaesque situation, the young man threw his hands up in the air and said, “Fine! Alright. I’ll bring it in tomorrow. Anything else?”&lt;br /&gt;
&lt;br /&gt;
“Not that I can think of,” said Mike cautiously.&lt;br /&gt;
&lt;br /&gt;
“Good. See you tomorrow, then.”&lt;br /&gt;
&lt;br /&gt;
“Unless I’m having a nightmare,” murmured Mike into his beard, as he watched the young man walk out in a huff. Slowly shaking his head, not sure whether what just happened had been real, Mike turned back to the Jetta. “Thank heavens my customers usually aren’t like that...”&lt;br /&gt;
&lt;br /&gt;
&lt;hr /&gt;
&lt;br /&gt;
So what do you think of Mike’s story? Did it sound absurd, almost surreal? Granted, there are all kinds of people out there, so it’s not impossible to stumble into someone like Mike’s strange client, but it sure is uncommon.&lt;br /&gt;
&lt;br /&gt;
What if I told you the young man in the story worked in a car factory? Wouldn’t that make the story truly outlandish?&lt;br /&gt;
&lt;br /&gt;
And yet that’s precisely what happens quite often in software development. Many smart people have already spent lots and lots of words on &lt;a href=&quot;http://www.chiark.greenend.org.uk/~sgtatham/bugs.html&quot;&gt;how to report bugs&lt;/a&gt;, &lt;a href=&quot;http://blogs.msdn.com/b/oldnewthing/archive/2010/04/22/10000406.aspx&quot;&gt;how to ask for help&lt;/a&gt;, &lt;a href=&quot;http://blogs.msdn.com/b/oldnewthing/archive/2010/04/21/9999675.aspx&quot;&gt;how to follow up&lt;/a&gt; and &lt;a href=&quot;http://catb.org/~esr/faqs/smart-questions.html&quot;&gt;how to ask questions&lt;/a&gt; in general. I don’t have much to add to all that, but the next time you complain that some piece of software isn’t working without giving any relevant details, remember this story. Doubly so if you actually work in something related to software development.</description><link>http://beardseye.blogspot.com/2012/11/fix-my-car_17.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-2408186735109057471</guid><pubDate>Fri, 23 Mar 2012 03:50:00 +0000</pubDate><atom:updated>2012-03-23T11:15:52.048-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">creative process</category><category domain="http://www.blogger.com/atom/ns#">creativity</category><category domain="http://www.blogger.com/atom/ns#">game design</category><category domain="http://www.blogger.com/atom/ns#">games</category><category domain="http://www.blogger.com/atom/ns#">mass effect</category><title>My Playthrough On Your Fridge Door</title><description>Two days ago, I decided to take a brief break at work. 
Since I was still gloomy from having finished Mass Effect 3 that weekend, I went surfing the Web to see what&#39;s new on that front and stumbled upon a couple of articles that inspired me to write a new &lt;a href=&quot;http://beardseye.blogspot.com/2012/03/are-players-just-audience.html&quot;&gt;post&lt;/a&gt; on my blog. Only two days have passed since, but it seems much longer.&lt;br /&gt;
&lt;br /&gt;
To be honest, I didn&#39;t expect to succeed at generating any kind of discussion about this. Most people, perhaps predictably, responded with a knee-jerk reaction along the lines of &quot;Yes, you&#39;re just audience, so shut up!&quot; Finding that there were people who agreed with me was quite gratifying. Finding that there were people who disagreed and wanted to discuss it, however, took me completely -- and pleasantly -- by surprise.&lt;br /&gt;
&lt;br /&gt;
I don&#39;t blog much. To quote Snoopy, &lt;a href=&quot;http://www.goodreads.com/quotes/show/374639&quot;&gt;&quot;there&#39;s no sense in doing a lot of barking if you don&#39;t really have anything to say&quot;&lt;/a&gt; and I usually don&#39;t have much to say that other people haven&#39;t already said more eloquently or elegantly. To find myself blogging for the second time in one week is a new experience. I&#39;d like to thank all the people who took their time to comment and to converse with me in various forums: thanks for inspiring me to write more.&lt;br /&gt;
&lt;br /&gt;
Enough of meta-blogging. Let&#39;s get to the meat of the discussion: over the past two days, I&#39;ve seen a few valid and interesting counter-arguments to my claim that players aren&#39;t merely an audience any more. I&#39;d like to address those arguments here. (By the way, if you haven&#39;t finished Mass Effect 3 and don&#39;t want spoilers, be careful when following links from here.)&lt;br /&gt;
&lt;br /&gt;
Let&#39;s start with the one that&#39;s easiest to refute and take it from there.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;


&lt;b&gt;Playing Games is Not Art&lt;/b&gt;&amp;nbsp;&lt;/h3&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;&lt;a href=&quot;http://www.gocomics.com/calvinandhobbes/1987/01/25&quot;&gt;Hobbes: Look, it&#39;s just a game. &lt;/a&gt;&lt;br /&gt;&lt;a href=&quot;http://www.gocomics.com/calvinandhobbes/1987/01/25&quot;&gt;Calvin: I know! You should see me when I lose in real life!&lt;/a&gt;&lt;/i&gt;&lt;/span&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.gocomics.com/calvinandhobbes/1987/01/25&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;-- &quot;Calvin and Hobbes&quot; by Bill Watterson&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;br /&gt;
No, it&#39;s not. I agree completely. Maybe some day, playing certain kinds of games will become an art form, but right now, playing games is definitely not art.&lt;br /&gt;
&lt;br /&gt;
Thing is, I never claimed it was. Don&#39;t get me wrong, I would love to see the day when there are games that allow players to use them as a medium for their artistic expression. But for now, what I claim is that playing games -- at least some games -- is an act of creativity. I also claim that players, by playing the game, complete the creative process the game developers initiated.&lt;br /&gt;
&lt;br /&gt;
Although not as strong as &quot;playing games is art&quot;, this claim still sounds weird. The reasoning that rejects this claim tends to fall into two categories. The first centers on:&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;


&lt;b&gt;Finite, Limited Universe of Expression&lt;/b&gt;&lt;/h3&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;My arena of literal expression offered&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;four avenues to the topic of elimination,&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;two references to human anatomy,&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;one request for divine imprecation,&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;one standard description of or request for coitus,&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;and a coital variation which was no longer an option for me&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;&amp;nbsp;since my mother was deceased.&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;All in all, it was enough.&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;&lt;a href=&quot;http://www.goodreads.com/quotes/search?q=to+be+a+true+poet+is+to+become+god%2C+Dan+Simmons&amp;amp;commit=Search&quot;&gt;-- Martin Silenus, from &quot;Hyperion&quot; by Dan Simmons&lt;/a&gt; &lt;/i&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
Arguments in this category reject players&#39; participation in the creative process because said players are restricted to a finite and rather limited set of options, rigidly established in computer code by game developers. As Michael Rohde &lt;a href=&quot;http://www.goozernation.com/video-games/index.php/news/1466-refunding-gamers-their-money-for-mass-effect-3-and-the-impact-on-the-industry#comment-1394&quot;&gt;neatly put it&lt;/a&gt;:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
[...] you are playing code, you are not helping write the story line, you do not have the ability to write unique aspects of the game. You are navigating a trail with finite endings.&lt;/blockquote&gt;
The problem with this argument is that it&#39;s ultimately quantitative in nature. Just because the English language has a finite number of ways of putting its words together in non-gibberish ways doesn&#39;t mean that writing a sonnet -- a rigidly defined and limited form in itself -- is not a creative act. We have Shakespeare to back that one up.&lt;br /&gt;
&lt;br /&gt;
To put it differently, if you&#39;re given a small set of Lego pieces, you can still put them together in a unique, creative way. In the case of the Mass Effect games, the sheer combinatorial explosion of possible choices to take in the game is at least comparable to a box of Legos. Even though those choices converge on an extremely small set of final outcomes, the act of making those choices deliberately is no less creative.&lt;br /&gt;
&lt;br /&gt;
That&#39;s the second problem with this argument: by focusing only on the creative aspect of the game&#39;s fixed assets, such as the story or the cinematics, it misses the fact that these assets themselves serve as a vehicle for players&#39; creative expression. What each player in Mass Effect 3 creates is the &lt;i&gt;character&lt;/i&gt; of Shepard and his or her &lt;i&gt;arc&lt;/i&gt;. Aleksander Adamkiewicz &lt;a href=&quot;http://www.gamasutra.com/view/news/163490/Opinion_Mass_Effect_3s_promise_of_ownership_is_one_it_couldnt_keep.php#comment143103&quot;&gt;expressed this nicely&lt;/a&gt; in his comment on a &lt;a href=&quot;http://www.gamasutra.com/view/news/163490/Opinion_Mass_Effect_3s_promise_of_ownership_is_one_it_couldnt_keep.php&quot;&gt;related Gamasutra article&lt;/a&gt;:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
[...] I always thought Shepard was an empty vessel for the player to fill with their interpretation. [...] I never found that Shep undergoes any character development in the narrative of the game, yes he underwent a development -in my head- but not in the game itself.&lt;/blockquote&gt;
That is precisely the allure of the Mass Effect series, the characteristic that sets it apart from so many other adventures and RPGs. I posit that almost all players go through this &quot;Shepard-development in their heads&quot;. Some do it more consciously, more deliberately than others, but everyone who&#39;s playing the game for more than &quot;shooting stuff&quot; does it.&lt;br /&gt;
&lt;br /&gt;
Still, even this might not be enough to accept the players as creative agents. There&#39;s still one more category of counter-arguments to tackle:&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;


&lt;b&gt;The Question of Intent&lt;/b&gt;&lt;/h3&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.quotationspage.com/quote/470.html&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;The most exciting phrase to hear in science,&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.quotationspage.com/quote/470.html&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;the one that heralds new discoveries,&amp;nbsp;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.quotationspage.com/quote/470.html&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;is not ‘Eureka!’, but ‘That’s funny …’&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.quotationspage.com/quote/470.html&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;-- Isaac Asimov&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;br /&gt;
My son&#39;s fifth birthday was in February, when most of his friends are out of town on vacation, so we threw a party for them last Saturday. This year, the main attraction was a magician. At one point, he invited my son on stage to &quot;grant him magical powers&quot;, so he can &quot;assist&quot; the magician.&lt;br /&gt;
&lt;br /&gt;
Asking for a volunteer in the audience and inviting him or her to participate is a standard part of a good magician&#39;s repertoire. However, this act of participation does not turn the audience member into anything more than audience. Their participation does not promote them to the status of the magician.&lt;br /&gt;
&lt;br /&gt;
Why, then, do I argue that the act of playing a videogame can promote a player into a creator?&lt;br /&gt;
&lt;br /&gt;
The crucial difference is in the intent. The magician doesn&#39;t intend to truly empower you, while a good game designer strives to do precisely that. Game developers have been acknowledging the players&#39; desire for creative expression for years. One small example is Burnout Revenge, a game whose Xbox 360 version turned six this month. A notable addition in the Xbox 360 edition is the &lt;a href=&quot;http://en.wikipedia.org/wiki/Burnout_Revenge#Xbox_360_version&quot;&gt;Burnout Clips feature&lt;/a&gt;, which allows you to create video clips of your offline races and share them with other players.&lt;br /&gt;
&lt;br /&gt;
A much more important example is Little Big Planet, along with its sequel. The sequel, in particular, is the best example of a game whose sole reason for existence is to allow players to create their own content. Its single player campaign, unlike the one from the first game, is primarily a showcase for the new game mechanics you can use in your own levels.&lt;br /&gt;
&lt;br /&gt;
And let&#39;s not forget other, much older &quot;build your own stuff&quot; games, such as Sim City. Sid Meier took this idea and turned it into an art form, if you&#39;ll pardon my choice of words, with his Civilization games.&lt;br /&gt;
&lt;br /&gt;
Okay, I&#39;m digressing and I&#39;m sure you get the point. Let&#39;s get back to the game that sparked this whole discussion.&lt;br /&gt;
&lt;br /&gt;
Was there intent to empower in Mass Effect? I would say that if there wasn&#39;t, BioWare is the luckiest company on Earth. That kind of repeated serendipity is hard to believe.&lt;br /&gt;
&lt;br /&gt;
Yes, I dare say that Mass Effect was deliberately designed to empower the players to build their own Shepard character in their heads.&lt;br /&gt;
&lt;br /&gt;
Is this an act of creativity? I believe it is. If you see a reason why it shouldn&#39;t be considered as such, feel free to drop me a comment.&lt;br /&gt;
&lt;br /&gt;
More importantly, has this creativity been deliberately encouraged by BioWare? I don&#39;t see any room for doubt here.&lt;br /&gt;
&lt;br /&gt;
Incidentally, this is precisely why so many players are so upset about the ending. Like I already said in my previous post, the ending &lt;i&gt;completely invalidates&lt;/i&gt; the players&#39; choices. Yes, we&#39;re all used to our choices finally converging on a small set of endings, as in the previous two installments. However, for various reasons (which I won&#39;t reveal in order to avoid spoilers) the ending(s) in Mass Effect 3 are different: players&#39; choices, instead of converging, are simply discarded and ignored.&lt;br /&gt;
&lt;br /&gt;
&lt;h3&gt;
&lt;b&gt;Does Any of This Matter At All?&lt;/b&gt;&lt;span id=&quot;d980913&quot; style=&quot;display: block;&quot;&gt;&lt;/span&gt;&lt;/h3&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.sluggy.com/comics/archives/daily/19980913&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;Bun-Bun: This is pretty crappy paraphrasing.&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.sluggy.com/comics/archives/daily/19980913&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;Riff: You should see what I have to work with! It&#39;s in crayon!&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.sluggy.com/comics/archives/daily/19980913&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;-- &quot;Sluggy Freelance&quot; by Pete Abrams&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;br /&gt;
People who have children are very well acquainted with one amusing fact about creativity: the vast majority of little kids completely suck at drawing. That&#39;s perfectly normal for their age, though, so we overlook it, praise them for their creativity and stick their doodles on the fridge door or on our cubicle wall. It&#39;s not because they&#39;re great works of art. It&#39;s because they matter to us, because our own children drew them.&lt;br /&gt;
&lt;br /&gt;
Similarly, most of the players&#39; creations in games that encourage creativity are, well, the digital equivalent of a five-year-old&#39;s &quot;crayon art&quot;. It&#39;s up to game developers to define their stance towards our creations.&lt;br /&gt;
&lt;br /&gt;
Am I wrong to believe that they should try to be like proud parents, indulgent and encouraging? Only time will tell. In the mean time, let me know what &lt;i&gt;you&lt;/i&gt; think.</description><link>http://beardseye.blogspot.com/2012/03/my-playthrough-on-your-fridge-door.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-1668092741700959745</guid><pubDate>Tue, 20 Mar 2012 18:02:00 +0000</pubDate><atom:updated>2012-03-23T00:56:40.418-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">creative process</category><category domain="http://www.blogger.com/atom/ns#">creativity</category><category domain="http://www.blogger.com/atom/ns#">game design</category><category domain="http://www.blogger.com/atom/ns#">games</category><category domain="http://www.blogger.com/atom/ns#">mass effect</category><title>Are Players Just Audience?</title><description>&lt;br /&gt;
&lt;span style=&quot;font-size: large;&quot;&gt;&lt;b&gt;Part 1: Sisyphus&lt;/b&gt;&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Over the last weekend, I finished my first playthrough of Mass Effect 3. For the moment, it looks like it will be the only one. My immediate reaction was that of vague disappointment. I felt dissatisfied, but I couldn&#39;t put my finger on why. It took me a bit of introspection to come to the same conclusion I later saw in countless blog posts, tweets and articles: the ending completely invalidated all the effort I had put into the game.&lt;br /&gt;
&lt;br /&gt;
All those hours of choosing my answers carefully, pondering what I would do if I had to face a situation like the one Shepard was facing, wondering whether I was being consistent and, in general, carefully crafting a character and learning more about myself in the process -- all that effort was for nothing. The game gave me three options, all of them cataclysmic and none of them even remotely satisfying. Under my guidance, Shepard had achieved what no sentient life form had ever achieved before, but it didn&#39;t matter, the choices were still the same. One of those choices went completely against every major decision I had made since Mass Effect 1, but that didn&#39;t matter, the choices were still the same. And after I took the one that looked the least unpleasant, I was treated to a cinematic that opened new questions while providing no answers whatsoever, and was left with no closure and a bad taste in my mouth.&lt;br /&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.youtube.com/watch?v=Dlc9pPeS2Q0&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&quot;Charlie Chaplin entered a Charlie Chaplin look-alike&lt;br /&gt; contest in Monte Carlo and came in third. Now that&#39;s a story. &lt;br /&gt;This... is something else.&quot;&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;i&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;a href=&quot;http://www.youtube.com/watch?v=Dlc9pPeS2Q0&quot;&gt;-- Mr. Goodkat&lt;/a&gt;&lt;/span&gt;&lt;/i&gt;&lt;/div&gt;
&lt;br /&gt;
Later, I sat down to read about other people&#39;s reactions. A small part of me still wishes I hadn&#39;t done that. Up until then, my misery was based only on my own experience. And then I read about other endings. Nominally, there are sixteen of them. In reality, it&#39;s 90% the same ending, with variations in small details. In other words, it wasn&#39;t merely something I had done in this playthrough. No matter what I did, no matter how many times I played the game again and again, I would never get a significantly different ending.&lt;br /&gt;
&lt;br /&gt;
They say that misery loves company, and it turned out to be true in the end. While finding out about the other endings made things worse, I was strangely comforted by the fact that I wasn&#39;t alone and that other fans were speaking up about this and voicing their -- or rather, our -- outrage. Further reading brought small glimmers of hope: official BioWare responses could be interpreted in a way that left open the possibility of changing the ending.&lt;br /&gt;
&lt;br /&gt;
I wasn&#39;t going to write about any of this. Others have done that more eloquently and I&#39;ve never acquired the taste for shouting &quot;me too&quot; in the crowd. What changed my mind was &lt;a href=&quot;http://www.theverge.com/gaming/2012/3/19/2885173/the-argument-over-mass-effect-3s-ending-makes-ken-levine-sad&quot;&gt;Ken Levine&#39;s reaction&lt;/a&gt;. And then Ben Dutka over at PSX Extreme &lt;a href=&quot;http://www.psxextreme.com/ps3-news/10858.html&quot;&gt;took it to another level&lt;/a&gt;. And suddenly I couldn&#39;t stay silent anymore.&lt;br /&gt;
&lt;br /&gt;
Their argument wasn&#39;t unexpected. On the contrary, it&#39;s a fairly standard line of defense, and in many respects it used to be completely valid. Used to be, but not anymore. Videogames have changed that, and people like Ken Levine and Ben Dutka are either not aware of it or don&#39;t want to accept it.&lt;br /&gt;
&lt;br /&gt;
Before I get to what has changed, however, I&#39;d like to go over the things that haven&#39;t. It&#39;s not just my flair for the dramatic: I want to clear the air first. As a geek, I have a natural distaste for bullshit in all its forms, and I want to make certain things clear to both my readers and the people out there making their righteous statements about artistic integrity.&lt;br /&gt;
&lt;br /&gt;
Ben Dutka is right, up to a point: we don&#39;t have the right to walk up to an artist and demand that they change their work. Or rather, we don&#39;t have the right to have that work changed just because we demanded it; whether we demand it or not is merely a question of etiquette, not of rights. We have not been granted any intellectual ownership rights by purchasing the work of art, or anything of the kind. We&#39;ve been granted the right to experience that work of art. Whether we enjoy it or not is immaterial.&lt;br /&gt;
&lt;br /&gt;
The problem with that attitude is that people use it to hide from the rest of the ugly truth: art has never been able to stand completely on its own. It&#39;s not just an independent fact, an axiom. It&#39;s the product of a process in which there are several crucial roles. If we don&#39;t acknowledge this, we&#39;re simply not being honest.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-size: large;&quot;&gt;&lt;b&gt;Part 2: Apollo&lt;/b&gt;&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
A work of art needs an Artist to make it and an Audience to appreciate it. Otherwise, it&#39;s simply a tree that fell in the forest with nobody to see or hear its fall. The cold, hard, ugly truth is that the Artist needs a Patron. Either that or a Day Job, but we&#39;ll get to that in a moment. The point is that the Artist, like any other human being, needs to earn a living.&lt;br /&gt;
&lt;br /&gt;
Patrons used to be reasonably wealthy people who would commission Artists for specific projects, e.g. &lt;a href=&quot;http://en.wikipedia.org/wiki/Mona_Lisa&quot;&gt;&quot;Paint my wife, dude, I need something to hang in my new home!&quot;&lt;/a&gt; Over time, the business model shifted. Patronage became crowdsourced and the Middlemen stepped onto the stage. Nowadays, the Artist will typically go to a bunch of Middlemen for help with any and all of the following: getting the means to create a work of art, gathering an Audience for a work of art, putting the Artist together with other Artists to collaborate on a work of art and, ultimately, getting paid for a work of art.&lt;br /&gt;
&lt;br /&gt;
The rise of Middlemen wasn&#39;t the only consequence of patronage being crowdsourced. Art and entertainment started overlapping a lot more than before. The Audience demands what Kurt Cobain sang: &quot;Here we are now, entertain us!&quot; You want the patronage? &lt;b&gt;Deliver!&lt;/b&gt;&lt;br /&gt;
&lt;br /&gt;
Like it or not, the fact remains that the most famous modern Art is inextricably entangled with Entertainment and Business. Sure, you can still adamantly refuse to compromise your artistic integrity. If you&#39;re very, very good, you might be able to get away with it. If not, you still have the perfectly valid option of practicing your art as a &lt;i&gt;hobby&lt;/i&gt;. The stereotypical image of the &quot;Starving Artist&quot; comes from people who refuse to accept this reality and try to be &lt;i&gt;professional&lt;/i&gt; artists without an impressive track record and with rigid artistic integrity.&lt;br /&gt;
&lt;br /&gt;
Ben Dutka can lament this reality as much as he wants. He can warn us that it leads to Art degenerating into &quot;some hideous, mutated, mass-generated assembly of likely sophomoric compromises&quot;. Guess what? We&#39;ve already arrived there. That place is called &lt;a href=&quot;http://www.huffingtonpost.com/john-farr/just-when-did-the-quality_b_207239.html&quot;&gt;Hollywood&lt;/a&gt;. The last Hollywood movie I&#39;ve seen that stood out as a memorable work of art was &quot;Inception&quot;, and even that was watered-down art.&lt;br /&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://www.youtube.com/watch?v=cWB9vsgmtyM&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&quot;Tell it to the one legged man so he can bump it off down the road.&quot;&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;i&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;a href=&quot;http://www.youtube.com/watch?v=cWB9vsgmtyM&quot;&gt;-- Sloe&lt;/a&gt; &lt;/span&gt;&lt;/i&gt;&lt;/div&gt;
&lt;br /&gt;
Do I want videogames to degenerate into Hollywood-like decadence? Hell, no! Things are bad as they are: just look at Assassin&#39;s Creed and how it&#39;s getting stretched beyond recognition, worse than the &quot;Wheel of Time&quot; used to be before Brandon Sanderson took over. No, thanks. I&#39;m all for artistic integrity. But let&#39;s not pretend that things are not the way they are.&lt;br /&gt;
&lt;br /&gt;
And, by all means, let&#39;s not allow the defense of artistic integrity to become the reason for the Middlemen to further crush it and suppress it. Get it in your heads, Artists, those Middlemen &lt;i&gt;have&lt;/i&gt; the power to crush your integrity and they will not hesitate to use it, because that&#39;s how they make &lt;i&gt;their&lt;/i&gt; living.&lt;br /&gt;
&lt;br /&gt;
Mass Effect 3 has the potential to become the textbook argument for publishers to impose their will on artists: &quot;Yeah, we let you guys do your artsy-fartsy stuff in Mass Effect 3 and look how &lt;i&gt;that&lt;/i&gt; turned out. Look at this sales chart. Look at it, damn you!&quot;&lt;br /&gt;
&lt;br /&gt;
I&#39;m sorry you&#39;re sad, Ken Levine, but you&#39;re also wrong. Leonardo da Vinci was told to paint the wife of Francesco del Giocondo. He could have decided to paint the lake behind her, instead, but he didn&#39;t. He did the work he was hired to do. Was he &quot;disappointed in the emotional feeling&quot; he got because he &quot;didn&#39;t really create it&quot;? I guess we can&#39;t find out because he&#39;s not around to give us an interview, but I look at La Gioconda&#39;s smile and I doubt it.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-size: large;&quot;&gt;&lt;b&gt;Part 3: Orpheus&lt;/b&gt;&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
None of that, however, was my main point. All of the preceding discussion applies to the traditional artistic media. Videogames have changed the equation, and that&#39;s really the bone I want to pick with Ben Dutka.&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
I think the fact that video games are interactive, and the fact that gamers have more control than ever (customization, user-creativity, etc.) has confused things but the fact remains, a story is a story. It&#39;s art, regardless of the medium.&lt;/blockquote&gt;
I take issue with that claim. It&#39;s the same logic that the RIAA and MPAA use in their claims and in their fight for SOPA and PIPA: &quot;Nothing has changed.&quot; Everything has changed, Ben, and it&#39;s time to face it.&lt;br /&gt;
&lt;br /&gt;
You cannot be a game designer without learning one important lesson: you are not telling a story, you are helping your players create their own stories. One of my favorite books on my non-fiction bookshelf is &quot;Game Architecture and Design&quot;, by Andrew Rollings and David Morris. I still remember the sense of wonder and realization when I read the following words:&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
Even puzzle games like Tetris in one sense tell a story and are thus &quot;dramatic&quot; to an extent.&lt;/blockquote&gt;
Reading those words, I flashed back to all the times my buddy and I would get together and excitedly discuss the game of Descent II we had just played over the modem; I remembered the &lt;a href=&quot;http://www.elvenrunes.com/&quot;&gt;Elven Runes&lt;/a&gt; website, where people would post their logs after an exciting session of &lt;a href=&quot;http://mume.org/&quot;&gt;MUME&lt;/a&gt;. The image these words painted in my mind was surprisingly vivid: a bunch of people gathered around a fire at night, swapping stories. For a guy who has never had an actual campfire experience, that image was most alluring, and the realization that this experience is, in a sense, what videogames bring to countless players around the world was... mindblowing, to put it mildly.&lt;br /&gt;
&lt;br /&gt;
Just what role does your story play in the game you create? That&#39;s not a new debate. Just ask Ernest Adams &lt;a href=&quot;http://www.designersnotebook.com/Lectures/Interactive_Narratives_Revisit/interactive_narratives_revisit.htm&quot;&gt;what he thinks about it&lt;/a&gt;. Still not convinced? Digital Worlds has &lt;a href=&quot;http://digitalworlds.wordpress.com/2008/03/31/do-game-players-tell-or-create-stories/&quot;&gt;a post about the whole debate&lt;/a&gt;.&lt;br /&gt;
&lt;br /&gt;
Here&#39;s the bottom line: we, the players, participate in the creative process and our participation is crucial because it&#39;s the final step. The players &lt;i&gt;complete&lt;/i&gt; the creative process.&lt;br /&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;a href=&quot;http://movieclips.com/HQds-lucky-number-slevin-movie-the-rabbi/&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;&quot;I live on both sides of the fence. My grass is always green.&quot;&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;i&gt;&lt;a href=&quot;http://movieclips.com/HQds-lucky-number-slevin-movie-the-rabbi/&quot;&gt;-- The Rabbi&lt;/a&gt; &lt;/i&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
And &lt;i&gt;that&lt;/i&gt;, Ben, is what gives us the right that no other audience had before: the right to demand that our role in the storytelling and creative process be honored and respected.&lt;br /&gt;
&lt;br /&gt;
Do I have a &lt;i&gt;legal&lt;/i&gt; right to demand a satisfying ending for Mass Effect 3? I don&#39;t think so, although &lt;a href=&quot;http://www.pcmag.com/article2/0,2817,2401775,00.asp&quot;&gt;at least one person disagrees&lt;/a&gt; with me.&lt;br /&gt;
&lt;br /&gt;
Do I have a &lt;i&gt;moral&lt;/i&gt; right? I believe I do. I paid BioWare and EA what I owed them. But whether they like it or not, they owe me, too.&lt;br /&gt;
&lt;br /&gt;
Yes, Ben, I &lt;i&gt;want&lt;/i&gt; to set a precedent, no matter how dangerous you think it is. I want BioWare to recognize the fact that their audience is not just an audience anymore: we&#39;re minor artists ourselves. We&#39;re storytellers. We watch the tree fall in the forest and then we go and build our houses from it.&lt;br /&gt;
&lt;br /&gt;
To people like &lt;a href=&quot;http://www.digitaltrends.com/gaming/exclusive-mass-effect-3s-director-addresses-the-games-controversies/&quot;&gt;Casey Hudson&lt;/a&gt; and &lt;a href=&quot;http://www.theverge.com/gaming/2012/3/19/2885173/the-argument-over-mass-effect-3s-ending-makes-ken-levine-sad&quot;&gt;Paul Barnett&lt;/a&gt;, these creations might not be as important as their own artistic integrity. But the least they and the rest of BioWare could do is not tear them down.&lt;br /&gt;
&lt;br /&gt;
So what&#39;ll it be, guys? &lt;a href=&quot;http://www.designersnotebook.com/Columns/049_What_Kind_of_Designer/049_what_kind_of_designer.htm&quot;&gt;What kind of designers are you?&lt;/a&gt;&lt;br /&gt;
&lt;div&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: left;&quot;&gt;
&lt;i&gt;If you liked this, check out the follow-up: &lt;a href=&quot;http://beardseye.blogspot.com/2012/03/my-playthrough-on-your-fridge-door.html&quot;&gt;My Playthrough on Your Fridge Door&lt;/a&gt;&lt;/i&gt;.&lt;/div&gt;
&lt;/div&gt;</description><link>http://beardseye.blogspot.com/2012/03/are-players-just-audience.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>6</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5076316697286446177</guid><pubDate>Tue, 08 Nov 2011 01:58:00 +0000</pubDate><atom:updated>2011-11-07T22:58:58.381-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">incompetence</category><category domain="http://www.blogger.com/atom/ns#">mediocrity</category><category domain="http://www.blogger.com/atom/ns#">rant</category><title>Old School or Just Old?</title><description>I’ve been looking for a diplomatic way to start this rant. After several false starts and a couple of glasses of wine, I realized that beating around the bush wouldn’t do it. Instead, I&#39;ll dispense with the bullshit and jump straight into what’s bothering me:&lt;br /&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: center;&quot;&gt;
&lt;b&gt;I’m tired of being immersed in a culture of mediocrity.&lt;/b&gt;&lt;/div&gt;
&lt;br /&gt;
There, I’ve said it. Now I have to spend the rest of the post explaining what the hell I meant by that. Don’t expect me to beat around the bush, either. If you disagree with what I say, fine, but I’m not going to sugar-coat what I think, because too much sugar-coating is part of the problem.&lt;br /&gt;
&lt;br /&gt;
I want to make one thing clear right from the start: I have a good job. There’s a lot to love about it. For one thing - the most important one, really - the work I do is interesting, challenging, ambitious and with plenty of opportunities to learn new things. It’s the kind of work that I often wished I could do and the fact that my wish came true is something to be happy about.&lt;br /&gt;
&lt;br /&gt;
Plus, the compensation is good enough to allow me and my family to live comfortably. To top it off, my boss is a nice dude, trusts me to do good work, helps me out when I need it and doesn’t make a fuss when I need to telecommute.&lt;br /&gt;
&lt;br /&gt;
Why, then, do I describe all that as a &quot;good job&quot;, instead of, say, &quot;dream job&quot;?&lt;br /&gt;
&lt;br /&gt;
I guess I could blame it all on my parents. They helped me discover computers and programming when I was six, and it has been my passion ever since. I’m almost thirty-three now and it has been a wonderful journey so far. Most of the stuff I can do, I discovered by trying to do it and looking for information on how it could be done. Not that I mean to imply I can do or learn anything. On the contrary, there have been things I tried to do or learn and gave up on. The important thing, to me, is that I acknowledged giving up: for whatever reason, I chose not to expend the necessary effort.&lt;br /&gt;
&lt;br /&gt;
On that journey of discovery, I’ve had the privilege to meet quite a few brilliant people. The first time was in 1993. Having graduated at the top of my class in grade school, I sailed into my high school like the Titanic and was promptly sunk by the realization that almost everyone there was more brilliant than I was. It was one of the best times of my life.&lt;br /&gt;
&lt;br /&gt;
Another was when I went to work for Synopsys. It had taken me a bit more than seven years and almost as many jobs to find them. Once again, I was surrounded by brilliant people, all sharing their knowledge and pooling their skills to overcome nearly impossible challenges and improve themselves and the software they were developing.&lt;br /&gt;
&lt;br /&gt;
Like I said, I can blame it all on my parents. My dad used to say, &quot;Don’t go to the army if you don’t intend to be a general.&quot; By that he didn’t mean I should be unreasonably ambitious; rather, he meant I should always do my best and try to get better. The idea is to aim as high as you can, work hard and see where that gets you.&lt;br /&gt;
&lt;br /&gt;
That is why I get so pissed off by people who refuse to do their job without someone holding their hand. That is why I can’t fathom how people think they can change the code without understanding what it does or what they’re doing. That is why I can’t stand people who don’t even know the basic stuff anyone with their job description is required to know, but are quite happy to choose their own solution even after you patiently explain why they should do it differently.&lt;br /&gt;
&lt;br /&gt;
The worst thing, however, is that you’re not allowed to say &quot;No, I won’t have a half-hour NetMeeting session with you. You didn’t even &lt;i&gt;try&lt;/i&gt; to understand what it is you’re doing. You just blindly tried to apply a completely arbitrary set of changes and when that didn’t work, you gave up and decided you need someone to do it with you. And no, I won’t do it for you, either.&quot;&lt;br /&gt;
&lt;br /&gt;
Don’t get me wrong, I’m not ranting about the lack of knowledge. I’ve yet to see someone who was magically born knowing how to write code. I have no problem helping people who need a hand, either. What I’m railing against is the bad attitude, the kind you get in people who don’t even try. You can spot it right away, because they don’t even ask you questions. There’s a vast difference between:&lt;br /&gt;
&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
I tried doing X, but it didn’t work. Then I looked at Y and I think it does Z, but I’m not sure why P and Q aren’t there. Am I supposed to do A and B before X?&lt;/blockquote&gt;
&lt;br /&gt;
and:&lt;br /&gt;
&lt;br /&gt;
&lt;blockquote class=&quot;tr_bq&quot;&gt;
I tried doing X. It didn’t work and I don’t know why. Can we have a NetMeeting session so you can help me out?&lt;/blockquote&gt;
&lt;br /&gt;
To make things worse, managers often nurture that kind of attitude. For example, suppose you’re writing some code that has to do X and there’s a robust, mature open source library Y that helps you do X without reinventing the wheel. If you decide to use Y, then you’ve shackled yourself with maintenance of that bit of code forever. Whenever someone gets the task of doing something with that code, all they have to do is complain that they don’t know how to use Y and the task will most likely get reassigned to you. It doesn’t matter that you didn’t know how to use Y, either, before you decided to use it to do X. Apparently, you’re the only one who can learn how to use an open source library whose documentation is freely available out there.&lt;br /&gt;
&lt;br /&gt;
That’s what I was referring to by &quot;culture of mediocrity&quot;. It’s not just the belief that you don’t even have to try to learn something or expend some additional effort, it’s the belief you &lt;i&gt;shouldn’t&lt;/i&gt; have to try. It’s the belief that being given a job is the proof that you know enough and if you don’t, it’s the fault of whoever gave you the job. It’s the belief that nobody has the right to expect you to go an inch beyond that. In extreme cases, it’s the belief that going beyond is only going to get you saddled with more work, so you’re a fool to do it.&lt;br /&gt;
&lt;br /&gt;
By all means, don’t take this as a criticism of my workplace specifically. It’s not unique to the company I work for; rather, it’s widespread among BigCo-type companies. I’m not sure why that happens, or why there are so few non-startup companies, like Google or Synopsys, that don’t let it happen on a grand scale. Frankly, I’m not in the mood to speculate about it, either.&lt;br /&gt;
&lt;br /&gt;
Things aren’t likely to change, no matter what I say here. For my part, I refuse to adapt to that situation. I’ll take pleasure in working with the few brilliant people scattered around in the sea of mediocrity that surrounds us. I’ll grumble and groan about the rest, until the work stops being interesting, the pay falls below the comfort threshold or the nice boss gets replaced by someone not so nice; in short, until it&#39;s time to move on.&lt;br /&gt;
&lt;br /&gt;
I like to think of myself as &quot;old school&quot;, since I’ve gone from writing spaghetti code in ZX Spectrum’s BASIC (and even worse Z80A assembler code) to writing my own DSLs and compiling them to the JVM. But maybe that’s not it. After all, I can often name the song playing in an elevator or a mall. I remember when MTV actually played music videos all the time. One of my favorite bands broke up after thirty years of playing together. I used to run a BBS that was a point in FidoNet. Maybe I’m not &quot;old school&quot;, but simply old.&lt;br /&gt;
&lt;br /&gt;
Whatever. If you can’t be bothered to even try to learn new things, then get off my lawn.</description><link>http://beardseye.blogspot.com/2011/11/old-school-or-just-old.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5093412607320985006</guid><pubDate>Thu, 12 Nov 2009 12:37:00 +0000</pubDate><atom:updated>2009-11-12T10:56:44.411-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">scrum politics</category><title>Scrum and Politics</title><description>Here&#39;s why so many people hate Agile: consultants. Yes, many of them are nice people who truly believe in what they&#39;re teaching. Unfortunately, this doesn&#39;t change the fact that, at the end of the day, they get to walk away and you&#39;re stuck with whatever they did or failed to do.&lt;br /&gt;&lt;br /&gt;Think about process consultants for a moment: their job is to come to someone else&#39;s workplace and make them change the way they do things. They don&#39;t come uninvited, though; you have to invite them in, just like vampires. Once they&#39;re inside, they have to make things work the new way. Invariably, they will encounter resistance from people who are comfortable doing things the old way. They have to know how to filter out legitimate idiosyncrasies of a particular workplace from loads of bullshit coming their way. In other words, they have to be well-versed in workplace politics.&lt;br /&gt;&lt;br /&gt;Above all, though, a process consultant has to demonstrate success, because that&#39;s what he&#39;s selling. The project has to be successful or the consultant&#39;s company loses face and money. And that&#39;s the bottom line: the project will be perceived as successful, no matter what. 
That&#39;s also the top reason why a consultant needs to be a good &quot;office politician&quot;, because he has to turn casualties into &quot;collateral damage&quot;.&lt;br /&gt;&lt;br /&gt;Don&#39;t get me wrong. Like I said, a lot of those people are essentially nice; they don&#39;t do this stuff out of malice, but out of belief that they&#39;re helping. What I&#39;m trying to explain here is that the road to hell really is paved with good intentions.&lt;br /&gt;&lt;br /&gt;Why am I singling out Agile, though? Mainly because of Scrum. When it comes to project management, Agile consultants will always reach for Scrum and that&#39;s the source of more than half of the trouble they bring. I hereby postulate that Scrum is especially vulnerable to office politics.&lt;br /&gt;&lt;br /&gt;&lt;div style=&quot;text-align: center;&quot;&gt;&lt;div style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;The Secret of Scrum&lt;/span&gt;&lt;br /&gt;&lt;/div&gt;&lt;/div&gt;&lt;div style=&quot;text-align: right;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&lt;span style=&quot;font-size:78%;&quot;&gt;“If you can&#39;t get rid of the skeleton in your closet,&lt;br /&gt;you&#39;d best teach it to dance.”&lt;br /&gt;George Bernard Shaw&lt;/span&gt;&lt;br /&gt;&lt;/span&gt;&lt;/div&gt;&lt;br /&gt;A typical sales pitch for Scrum is based on the comparison with the &quot;traditional waterfall&quot; process. Don&#39;t let the consultants fool you: incremental and iterative development is not unique to Scrum. Neither is the idea that you should ship often. 
What really sets Scrum apart is a very simple idea: free the developers.&lt;br /&gt;&lt;br /&gt;The whole &lt;a href=&quot;http://www.implementingscrum.com/2006/09/11/the-classic-story-of-the-pig-and-chicken/&quot;&gt;&quot;chicken and pig&quot;&lt;/a&gt; philosophy, the existence of a Scrum Master who focuses on &quot;removing impediments&quot;, the emphasis on self-organizing teams -- all of that aims to clear all the typical obstacles that a team faces when trying to develop a product. The theory is that, once these obstacles are out of the way, the team will be free to develop a great product. That&#39;s why they have all these people at their disposal, right? The team focuses on building great stuff and everyone else focuses on giving the team what they need. And then they all live happily ever after...&lt;br /&gt;&lt;br /&gt;The most fascinating thing about it is that it really can and does work. I&#39;ve seen a few teams that pulled it off. I&#39;ve also seen a few that could have pulled it off, but the management wasn&#39;t inclined to let them try. The one thing that the consultants won&#39;t tell you or your bosses, however, is that not all teams &lt;span style=&quot;font-style: italic;&quot;&gt;can&lt;/span&gt; pull it off. That&#39;s Scrum&#39;s dirty, dark little secret. If you have a process that requires a self-organizing team, then it should be obvious that your team needs to be capable of organizing itself. 
It&#39;s surprising to see how many people neglect this issue.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Growing Up&lt;/span&gt;&lt;br /&gt;&lt;div style=&quot;text-align: right;&quot;&gt;&lt;span style=&quot;font-size:78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;“A person&#39;s maturity consists in having found again&lt;br /&gt;the seriousness one had as a child, at play”&lt;/span&gt;&lt;br /&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;Friedrich Nietzsche&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;&lt;/div&gt;&lt;br /&gt;Let me spell it out clearly: self-organization is a very difficult skill to obtain and maintain. The typical response to this concern is that maturity can be achieved in time and that this &quot;only&quot; implies a ramp-up period. While I agree with that sentiment, this is usually the point where consultants screw things up: typically the ramp-up period is one sprint or so and there&#39;s not enough support for the team. You can&#39;t just dump it on a team and expect it to &quot;ramp up&quot; to it. Sure, every good team wants to be free, but that doesn&#39;t mean they know &lt;span style=&quot;font-style: italic;&quot;&gt;how&lt;/span&gt; to be free. 
It&#39;s an ironic, but not unusual situation to obtain the freedom you&#39;ve always wanted, only to realize that you don&#39;t know what to do with it.&lt;br /&gt;&lt;br /&gt;&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEickq5khqCiskYBRfWY64iiUu3DfzRBXqAI-mz7rBZKbmYBrF0RUDZ6ZSVGH_ddCwtE9XFrJWG2qLYc-FlqnJMu3Of9_ox-8aJF2VvXDC_YRiTBjRqKemIQpyFCBIK9ATTnCUqv2a7cZkA/s1600-h/calvin_headstand.gif&quot;&gt;&lt;img style=&quot;margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 99px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEickq5khqCiskYBRfWY64iiUu3DfzRBXqAI-mz7rBZKbmYBrF0RUDZ6ZSVGH_ddCwtE9XFrJWG2qLYc-FlqnJMu3Of9_ox-8aJF2VvXDC_YRiTBjRqKemIQpyFCBIK9ATTnCUqv2a7cZkA/s320/calvin_headstand.gif&quot; alt=&quot;&quot; id=&quot;BLOGGER_PHOTO_ID_5403213651753456818&quot; border=&quot;0&quot; /&gt;&lt;/a&gt;&lt;br /&gt;Maturity, both for teams and their members, means many things. It&#39;s not just about adapting to the fact that you won&#39;t have a manager telling you what to do. It&#39;s also about having enough context. It&#39;s amazing how little context a typical software developer needs, as a minimum, in order to do his or her bit and live happily oblivious of the bigger picture. Many developers out there know a lot about the data model of the underlying systems or about this application server or that legacy code, but don&#39;t really understand what the products really do and what makes them successful. If you put such people on an important project and tell them to organize themselves, they&#39;ll be like fish out of water. They know how to attack technical challenges, but they don&#39;t know how to make the project succeed. This issue is also easily addressed, at least in theory: ensure that the Product Owner guides the team. 
In practice, however, the person who should be the Product Owner is often too busy to participate properly. In such cases, you either have an absentee Product Owner or a surrogate whose contributions are of dubious quality.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Control Chaos, But Avoid Anarchy&lt;/span&gt;&lt;br /&gt;&lt;div style=&quot;text-align: right;&quot;&gt;&lt;span class=&quot;sqq&quot;  style=&quot;font-size:78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;“It is best to do things systematically, since we are only humans,&lt;br /&gt;and disorder is our worst enemy”&lt;/span&gt;&lt;/span&gt;&lt;span style=&quot;font-size:78%;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;span class=&quot;sqq&quot;  style=&quot;font-size:78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;Hesiod&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;&lt;/div&gt;&lt;br /&gt;Yet another critical mistake some consultants will make is to take the idea of cross-functional teams to an extreme. Personally, I think this mistake might be inspired by the name &quot;scrum&quot;, which comes from rugby. The whole approach was initially inspired by a paper published by Hirotaka Takeuchi and Ikujiro Nonaka, called &lt;a href=&quot;http://apln-richmond.pbworks.com/f/New%20New%20Prod%20Devel%20Game.pdf&quot;&gt;&quot;The New New Product Development Game&quot;&lt;span style=&quot;font-size:85%;&quot;&gt;[pdf]&lt;/span&gt;&lt;/a&gt;. 
In the paper, the waterfall approach is compared to a relay race and the new approach is compared to rugby, &quot;where a team tries to go the distance as a unit, passing the ball back and forth.&quot; Other people later named it Scrum to emphasize the idea that, just like in a &lt;a href=&quot;http://www.youtube.com/watch?v=hFXHRb-kHt8&quot;&gt;rugby scrum&lt;/a&gt;, all team members must &quot;push&quot; together to overcome the challenges.&lt;br /&gt;&lt;br /&gt;Ironically, people who put too much emphasis on this aspect tend to forget the bit about &quot;passing the ball back and forth&quot;. Team members all have roles and they were put in those roles for a reason. Working together in a cross-functional team means that the team is composed of different functions that cooperate smoothly, not that every team member should be called to perform any function at any moment, even if they are capable of doing so. There&#39;s a reason you hired someone to be a QA analyst in your company. It doesn&#39;t matter if they happen to have the skills to do some other job too; unless they&#39;ve been doing that other job -- in your company -- long enough to feel comfortable in it, they can&#39;t do that other job as well as a person whose job it really is. On top of that, the money you&#39;re paying someone must have at least something to do with the job they&#39;re doing and with their contractual responsibilities. 
It&#39;s bad form to piss all over that in the name of &quot;cross-functional teamwork&quot;.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Wagging The Dog&lt;/span&gt;&lt;br /&gt;&lt;div style=&quot;text-align: right;&quot;&gt;&lt;span style=&quot;font-size:78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;“Those who say religion has nothing to do with politics&lt;br /&gt;do not know what religion is.”&lt;/span&gt; &lt;span style=&quot;font-style: italic;&quot;&gt;&lt;br /&gt;Mahatma Gandhi&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;&lt;/div&gt;&lt;br /&gt;What does all this have to do with my initial claim that Scrum is especially vulnerable to office politics? Think about it: all of these issues can be, and often are, highly political. Saying that your team lacks maturity can be seen as an admission of failure. Alternatively, it can single you out as the &quot;negative thinker&quot; or &quot;that guy who opposes the change&quot;. And let&#39;s not forget that we&#39;re talking about a group of people here and that even if some of them are ready to admit the lack of maturity, others might take offense at this. All these are great incentives to sweep the problem under the carpet.&lt;br /&gt;&lt;br /&gt;Maturity isn&#39;t the only political issue here, either. Take the Product Owner, for example. If the only person who knows enough to be the Product Owner is too high in the food chain, this not only means that he or she won&#39;t be able to participate as required, it also guarantees that nobody will want to ask them for guidance as often as necessary.&lt;br /&gt;&lt;br /&gt;And heaven help you if you run into a consultant who&#39;s a &quot;cross-functional extremist&quot;. Since Agile is all about moving away from the &quot;heavyweight&quot; model, if you dare to pipe up about this issue you&#39;ll be accused of &quot;insisting on ceremony&quot; and &quot;placing too much importance on titles&quot;. 
Do you detect the odor of politics yet?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Make It Work&lt;/span&gt;&lt;br /&gt;&lt;div style=&quot;text-align: right;&quot;&gt;&lt;span style=&quot;font-size:78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;“Organizing is what you do before you do something,&lt;br /&gt;so that when you do it, it&#39;s not all mixed up.”&lt;/span&gt; &lt;span style=&quot;font-style: italic;&quot;&gt;&lt;br /&gt;A. A. Milne&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;&lt;/div&gt;&lt;br /&gt;Okay, so Scrum can be thoroughly screwed up by office politics. So what else is new? It&#39;s easy to point out the problem, let&#39;s hear some solutions. As always, there&#39;s no silver bullet. There are, however, ways to avoid some common pitfalls.&lt;br /&gt;&lt;br /&gt;First and foremost: help the team reach the necessary maturity. This is not a trivial task and it should not be underestimated. Remember all that time you&#39;re supposed to save by having a lightweight process with minimal overhead? Well, that doesn&#39;t come for free. Initially, you might have to invest a hell of a lot of time and effort into getting the team up to the point where it can truly be self-organized. Make sure that the right people are in charge of this effort. The person doing this should ideally be a project manager, hopefully in the Scrum Master role. Whatever you do, don&#39;t just dump this responsibility into the lap of whoever seems to be convenient. 
For example, your architects might seem like a good choice for &quot;team leads&quot; who can take care of this, but you should remember that there&#39;s a reason you hired them as architects: they&#39;ll be busy making sure everything fits in the &quot;big picture&quot; on the technical side of things; yes, this does include guiding other team members, but not organizing and managing them.&lt;br /&gt;&lt;br /&gt;Second, don&#39;t mistake teamwork and the cross-functional approach for anarchy. The team needs to work in an organized way, which means that roles are there for a reason. Self-organization isn&#39;t a democracy, either. Everyone should be free to contribute, but if the key decisions about your product are made by &quot;the majority&quot;, you get &lt;a href=&quot;http://c2.com/cgi/wiki?DesignByCommittee&quot;&gt;Design By Committee&lt;/a&gt; and we all know how well that usually turns out. In good teams, decisions are made by people who are the best fit for making them and there are always conflict-resolution mechanisms in place.&lt;br /&gt;&lt;br /&gt;Third, always maintain a measure of stability. Agile is all about adapting to changes quickly, but &quot;change&quot; here has a specific meaning; it refers to evolving requirements, not just any kind of change. You&#39;re already pushing a big change by switching to Scrum and you also know that your project requirements are bound to evolve over time, so that means your team will be facing changes on two fronts. Try to keep the rest of their environment as stable as possible. If you really have to introduce changes, do so gradually. Changes in team structure -- such as dividing the team into smaller teams to avoid crowding the daily scrum or merging two smaller teams into a larger one -- can be very disruptive. 
The lack of stability might easily demoralize your team, so that you end up losing more than you hoped to gain by restructuring.&lt;br /&gt;&lt;br /&gt;So far, I&#39;ve talked about measures that could be taken by anyone with enough authority to see them through. The last piece of advice I&#39;d like to offer applies to consultants exclusively. It&#39;s something that should be completely unnecessary to say, yet it seems that some consultants might need to be told this: &lt;span style=&quot;font-weight: bold;&quot;&gt;be there&lt;/span&gt;. Yes, I know that you&#39;re just one consultant and you&#39;re bringing change to the whole company and you have to go scurrying hither and yon to assess all the risks and ensure success and whatnot. But if you&#39;re not &lt;span style=&quot;font-weight: bold;&quot;&gt;there&lt;/span&gt; to give your team all the support they need, they -- and their project -- are going to suffer. They&#39;re under pressure from all sides: they have an important project which must succeed and they&#39;re expected to make this happen in a new way, using a new process. Be there for your team, listen carefully to their needs and do whatever you can to help them. You&#39;re the one who understands how things should be done, so make sure that the team can rely on you. It&#39;s not enough to tell them you&#39;re there to help them; actions speak louder than words and if you&#39;re not there, you&#39;ll lose their trust quickly. And it all goes downhill from there.&lt;br /&gt;&lt;br /&gt;Sounds hard? 
Welcome to the real world, where there ain&#39;t no such thing as a free lunch.</description><link>http://beardseye.blogspot.com/2009/11/heres-why-so-many-people-hate-agile.html</link><author>noreply@blogger.com (Anonymous)</author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEickq5khqCiskYBRfWY64iiUu3DfzRBXqAI-mz7rBZKbmYBrF0RUDZ6ZSVGH_ddCwtE9XFrJWG2qLYc-FlqnJMu3Of9_ox-8aJF2VvXDC_YRiTBjRqKemIQpyFCBIK9ATTnCUqv2a7cZkA/s72-c/calvin_headstand.gif" height="72" width="72"/><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-7165478365621572059</guid><pubDate>Mon, 23 Feb 2009 21:30:00 +0000</pubDate><atom:updated>2011-11-07T22:27:03.338-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">.NET</category><category domain="http://www.blogger.com/atom/ns#">C#</category><category domain="http://www.blogger.com/atom/ns#">reflection</category><title>Reflection Recipes</title><description>Have you ever seen that cartoon in which Speedy Gonzales guides Daffy Duck across a minefield?&lt;br /&gt;
&lt;blockquote&gt;
BOOM! There&#39;s one! BOOM! There&#39;s another! What do you mean you don&#39;t know where they are? You haven&#39;t missed one yet!&lt;/blockquote&gt;
&lt;br /&gt;
I&#39;ve been plumbing the depths of .NET reflection lately and it felt just like Daffy&#39;s trip: I knew where I was and where I wanted to be, but I hit every landmine on my way there. It&#39;s not Microsoft&#39;s fault, really. .NET reflection is a powerful beast and covering every little &quot;gotcha&quot; is heroic work, one that the guys in charge of MSDN almost pulled off. Nearly everything you&#39;ll need to survive is &lt;span style=&quot;font-style: italic;&quot;&gt;somewhere&lt;/span&gt; in the library. If you&#39;re trying to do something simple, you&#39;ll be fine; if you&#39;re trying to pull off something advanced, you&#39;ll need a map of the minefield. Here are some of the mines I&#39;ve hit so far.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Attributes: Not What It Looks Like, Honey&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
It&#39;s easy to think of attributes as something that gets attached to other elements of your code when it&#39;s compiled. Unfortunately, it&#39;s also incorrect. Consider the following code:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = true)]
public class UniqueIDAttribute : Attribute
{
    private static int nextID = 1;

    public int ID { get; private set; }

    public UniqueIDAttribute()
    {
        ID = nextID++;
    }
}

public class Base
{
    [UniqueID]
    public virtual void MyMethod() { }
}

public class Derived : Base
{
    public override void MyMethod()
    {
        base.MyMethod();
    }
}

class Program
{
    static void Main(string[] args)
    {
        UniqueIDAttribute attr;
        attr = (UniqueIDAttribute) Attribute.GetCustomAttribute(
            typeof(Derived).GetMethod(&quot;MyMethod&quot;),
            typeof(UniqueIDAttribute));
        Console.Write(attr.ID);
        Console.Write(&quot; &quot;);
        attr = (UniqueIDAttribute) Attribute.GetCustomAttribute(
            typeof(Base).GetMethod(&quot;MyMethod&quot;),
            typeof(UniqueIDAttribute));
        Console.WriteLine(attr.ID);
    }
}&lt;/pre&gt;
&lt;br /&gt;
I was honestly surprised that the output of this was &quot;1 2&quot; instead of the &quot;1 1&quot; I expected. I thought that placing a custom attribute on a method would attach an instance of that attribute to that method. This is not feasible for a variety of good reasons; for example, how do you handle the case of an attribute placed on a member of a generic class? Most of these problems could be worked around with &lt;span style=&quot;font-style: italic;&quot;&gt;lots&lt;/span&gt; of effort, but Microsoft decided to do things the easy way and invest that effort somewhere else.&lt;br /&gt;
&lt;br /&gt;
Here&#39;s what happens when you place an attribute on a code element: the compiler takes all the information necessary to construct that attribute and emits it together with that element. When you query the attribute at run time, the reflection code pulls out this information and uses it to construct the instance of the attribute at that moment. As an example, consider the following code:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;[AttributeUsage(AttributeTargets.Method)]
public class SampleAttribute : Attribute
{
    public int Value { get; protected set; }
    public string Comment { get; set; }

    public SampleAttribute(int value)
    {
        Value = value;
    }
}

public class SomeClass
{
    [Sample(17, Comment = &quot;Some text&quot;)]
    public void SomeMethod() { }
}&lt;/pre&gt;
&lt;br /&gt;
Now take a look at the IL for &lt;span class=&quot;code&quot;&gt;SomeMethod&lt;/span&gt;:&lt;br /&gt;
&lt;pre class=&quot;brush: text&quot;&gt;.method public hidebysig instance void  SomeMethod() cil managed
{
.custom instance void Sandbox.CSharp.SampleAttribute::.ctor(int32) = ( 
               01 00 11 00 00 00 01 00 54 0E 07 43 6F 6D 6D 65   // ........T..Comme
               6E 74 09 53 6F 6D 65 20 74 65 78 74 )             // nt.Some text
// Code size       2 (0x2)
.maxstack  8
IL_0000:  nop
IL_0001:  ret
} // end of method Base::SomeMethod&lt;/pre&gt;
&lt;br /&gt;
As you can see, the method metadata contains custom attribute information that tells the runtime which constructor to call, what parameter to supply to it and which property to set to which value.&lt;br /&gt;
&lt;br /&gt;
Once you understand all that, it&#39;s easy to see why the &lt;a href=&quot;http://msdn.microsoft.com/en-us/library/system.reflection.emit.customattributebuilder.aspx&quot;&gt;&lt;span class=&quot;code&quot;&gt;CustomAttributeBuilder&lt;/span&gt; class&lt;/a&gt; exists or how to use it.&lt;br /&gt;
&lt;br /&gt;
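Here&#39;s a hedged sketch of that class in action -- my own demo code, with made-up names like &lt;span class=&quot;code&quot;&gt;DemoAssembly&lt;/span&gt; -- which records the same constructor-plus-named-property blob on a dynamic method and then reads the attribute back:&lt;br /&gt;

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

[AttributeUsage(AttributeTargets.Method)]
public class SampleAttribute : Attribute
{
    public int Value { get; protected set; }
    public string Comment { get; set; }

    public SampleAttribute(int value)
    {
        Value = value;
    }
}

public static class AttributeEmitDemo
{
    // Emits a dynamic method carrying [Sample(17, Comment = "Some text")],
    // then queries it back; reflection reconstructs the attribute instance
    // from the recorded constructor call and named-property assignment.
    public static SampleAttribute BuildAndReadBack()
    {
        AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly(
            new AssemblyName("DemoAssembly"), AssemblyBuilderAccess.Run);
        ModuleBuilder module = assembly.DefineDynamicModule("DemoModule");
        TypeBuilder type = module.DefineType("DemoType", TypeAttributes.Public);
        MethodBuilder method = type.DefineMethod(
            "DemoMethod", MethodAttributes.Public, typeof(void), Type.EmptyTypes);
        method.GetILGenerator().Emit(OpCodes.Ret);

        Type attrType = typeof(SampleAttribute);
        CustomAttributeBuilder cab = new CustomAttributeBuilder(
            attrType.GetConstructor(new Type[] { typeof(int) }),    // ctor to record
            new object[] { 17 },                                    // ctor arguments
            new PropertyInfo[] { attrType.GetProperty("Comment") }, // named properties
            new object[] { "Some text" });                          // property values
        method.SetCustomAttribute(cab);

        return (SampleAttribute) Attribute.GetCustomAttribute(
            type.CreateType().GetMethod("DemoMethod"), attrType);
    }
}
```

The builder writes exactly the kind of metadata blob shown in the IL dump above; nothing is instantiated until the final &lt;span class=&quot;code&quot;&gt;GetCustomAttribute&lt;/span&gt; call.&lt;br /&gt;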
&lt;span style=&quot;font-weight: bold;&quot;&gt;Non-Virtual Interface: The Dark Secret&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Have you ever wondered how it is that you can implement an interface method in a class, without marking it as virtual? It had been bothering me since I learned C#, but I never really had a compelling reason to solve that particular mystery. It was enough to know that it Just Worked. At least, it was until I tried to dynamically create a class and make it implement an interface too.&lt;br /&gt;
&lt;br /&gt;
I&#39;ll stop here to make a quick aside for readers who know only C# (and/or Java) and might wonder why I think that interface methods should necessarily be virtual. A virtual method is late-bound: you don&#39;t know at compile time which implementation will be invoked when someone calls the method. Because of that, there&#39;s a layer of indirection between the method declaration and the method body; the class that defines the body of a virtual method -- either for the first time or by overriding it -- &quot;configures&quot; this layer of indirection to &quot;point&quot; to that particular body. An interface is nothing but a bunch of abstract (and therefore virtual) methods. That&#39;s why it seems counter-intuitive to implement an interface method without specifying the &quot;virtual&quot; keyword.&lt;br /&gt;
&lt;br /&gt;
If you try to run the following code, you&#39;ll get an interesting exception:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public interface IMyInterface
{
    void MyMethod();
}

class Program
{
    static void Main(string[] args)
    {
        AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly(
            new AssemblyName() { Name = &quot;MyAssembly&quot; },
            AssemblyBuilderAccess.Run);
        ModuleBuilder module = assembly.DefineDynamicModule(&quot;MyModule&quot;);
        TypeBuilder myClass = module.DefineType(
            &quot;MyClass&quot;,
            TypeAttributes.Class | TypeAttributes.Public,
            typeof(Object),
            new Type[] { typeof(IMyInterface) });
        MethodBuilder myMethod = myClass.DefineMethod(
            &quot;MyMethod&quot;,
            MethodAttributes.Public | MethodAttributes.HideBySig,
            typeof(void),
            Type.EmptyTypes);
        ILGenerator il = myMethod.GetILGenerator();
        il.Emit(OpCodes.Ret);
        Type myClassType = myClass.CreateType();
    }
}&lt;/pre&gt;
&lt;br /&gt;
The exception you&#39;ll get is &lt;span class=&quot;code&quot;&gt;TypeLoadException&lt;/span&gt; and it&#39;ll be complaining about &lt;span class=&quot;code&quot;&gt;MyClass&lt;/span&gt; not implementing &lt;span class=&quot;code&quot;&gt;MyMethod&lt;/span&gt;. If you add the &lt;span class=&quot;code&quot;&gt;MethodAttributes.Virtual&lt;/span&gt; flag to the method attributes, the problem will disappear.&lt;br /&gt;
&lt;br /&gt;
But what happens if you don&#39;t want &lt;span class=&quot;code&quot;&gt;MyMethod&lt;/span&gt; to be virtual? That&#39;s not really an option. For the reasons I explained above, the method &lt;span style=&quot;font-style: italic;&quot;&gt;must&lt;/span&gt; be virtual. What you might want is to make sure it cannot be overridden. To do that, you have to also add the &lt;span class=&quot;code&quot;&gt;MethodAttributes.Final&lt;/span&gt; flag to the method attributes. That will give you the equivalent of writing:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public class MyClass : IMyInterface
{
    public void MyMethod() { }
}&lt;/pre&gt;
&lt;br /&gt;
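In &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; terms, then, the fix for the example above is just a matter of flags. Here&#39;s a sketch of the corrected emit code, reorganized into a helper method (as far as I can tell, the C# compiler also adds &lt;span class=&quot;code&quot;&gt;NewSlot&lt;/span&gt; for implicit interface implementations, so I&#39;ve included it):&lt;br /&gt;

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

public interface IMyInterface
{
    void MyMethod();
}

public static class InterfaceEmitDemo
{
    public static Type CreateImplementation()
    {
        AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly(
            new AssemblyName() { Name = "MyAssembly" },
            AssemblyBuilderAccess.Run);
        ModuleBuilder module = assembly.DefineDynamicModule("MyModule");
        TypeBuilder myClass = module.DefineType(
            "MyClass",
            TypeAttributes.Class | TypeAttributes.Public,
            typeof(Object),
            new Type[] { typeof(IMyInterface) });
        MethodBuilder myMethod = myClass.DefineMethod(
            "MyMethod",
            // Virtual because interface methods are late-bound; Final so the
            // method cannot be overridden -- the "virtual sealed" combination.
            MethodAttributes.Public | MethodAttributes.HideBySig |
                MethodAttributes.NewSlot | MethodAttributes.Virtual |
                MethodAttributes.Final,
            typeof(void),
            Type.EmptyTypes);
        myMethod.GetILGenerator().Emit(OpCodes.Ret);
        return myClass.CreateType(); // no TypeLoadException this time
    }
}
```
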
The funny thing is that you can&#39;t declare a method as &lt;span class=&quot;code&quot;&gt;virtual sealed&lt;/span&gt;, yet that&#39;s exactly what the compiler does under the hood. I guess it&#39;s just one of those trade-offs: either you use a slightly leaky abstraction or you risk antagonizing people by making them learn the underlying concepts thoroughly.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Naming Your Type: Basic Hygiene&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Isn&#39;t it funny how something as useful as the string type can also cause so much trouble? Sooner or later everyone has to learn to sanitize their strings or risk having their software let little Bobby Tables wreak havoc on their data:&lt;br /&gt;
&lt;a href=&quot;http://imgs.xkcd.com/comics/exploits_of_a_mom.png&quot;&gt;&lt;img alt=&quot;&quot; border=&quot;0&quot; src=&quot;http://imgs.xkcd.com/comics/exploits_of_a_mom.png&quot; style=&quot;cursor: pointer; display: block; height: 126px; margin: 0px auto 10px; text-align: center; width: 410px;&quot; /&gt;&lt;/a&gt;&lt;br /&gt;
I must admit, though, that I didn&#39;t expect to have to sanitize the name when creating a type at run time. I found out, quite unexpectedly, that &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; seems to be allergic to types whose names contain a comma. Not that it blows up right away. You can use a name riddled with commas and still be fine when you create the type. What you can&#39;t do is use that type as a parent type. If you do, &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; will blow up with a rather unhelpful &lt;span class=&quot;code&quot;&gt;COMException&lt;/span&gt; saying &quot;Record not found on lookup.&quot;&lt;br /&gt;
&lt;br /&gt;
If you google for it, you&#39;ll eventually find out that there&#39;s a bug in reflection, supposedly triggered by strings that contain any of the characters &lt;span class=&quot;code&quot;&gt;[]*&amp;amp;+,\&lt;/span&gt; in the type name. My experiments indicate that the only problematic character seems to be the comma, but I&#39;m getting rid of all of those characters anyway.&lt;br /&gt;
&lt;br /&gt;
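Until the bug gets fixed, it&#39;s simple enough to scrub generated type names yourself. Here&#39;s the kind of helper I have in mind (the class and its name are my own invention):&lt;br /&gt;

```csharp
using System.Text;

public static class TypeNameSanitizer
{
    // The characters reported to trigger the "Record not found on lookup" bug.
    private const string Suspects = "[]*&+,\\";

    // Drops every suspect character; only the comma ever bit me, but it's
    // cheap to be paranoid about the whole set.
    public static string Sanitize(string typeName)
    {
        StringBuilder clean = new StringBuilder(typeName.Length);
        foreach (char c in typeName)
        {
            if (Suspects.IndexOf(c) < 0)
                clean.Append(c);
        }
        return clean.ToString();
    }
}
```
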
It&#39;s a nasty little bug and it&#39;s not documented in MSDN. I figured I should warn people about it, since it&#39;s such a &lt;a href=&quot;http://www.sluggy.com/daily.php?date=990228&quot;&gt;comma mistake&lt;/a&gt; after all.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Metadata Tokens: Remembering Past Lives&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
There are two stages in the life of a dynamic class: before and after the call to &lt;span class=&quot;code&quot;&gt;CreateType&lt;/span&gt;. In the first stage, you&#39;re using the &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; to define type features, such as fields, properties, methods, nested types, etc. When you&#39;re done defining your type, you call the &lt;span class=&quot;code&quot;&gt;CreateType&lt;/span&gt; method and the magic reflection fairy turns your &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; into a real live &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt;. But your little Pinocchio can either be a marionette or a real boy, and never the twain shall meet.&lt;br /&gt;
&lt;br /&gt;
In practice, this means that a &lt;span class=&quot;code&quot;&gt;FieldBuilder&lt;/span&gt; you get from calling &lt;span class=&quot;code&quot;&gt;DefineField&lt;/span&gt; on your &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; can only be used to define the field. Even though the &lt;span class=&quot;code&quot;&gt;FieldBuilder&lt;/span&gt; inherits from &lt;span class=&quot;code&quot;&gt;FieldInfo&lt;/span&gt;, you cannot use it to get or set the field value on the instances of the type obtained by calling &lt;span class=&quot;code&quot;&gt;CreateType&lt;/span&gt; on the &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt;. If you try to do so, you&#39;ll get a &lt;span class=&quot;code&quot;&gt;NotSupportedException&lt;/span&gt;, informing you that the &quot;invoked member is not supported in a dynamic module&quot;. Same thing applies to other builder classes.&lt;br /&gt;
&lt;br /&gt;
Looking around the documentation, you&#39;ll see that the usual workaround seems to be to look up the member by its name in the newly created type. That can get ugly, though. For example, consider what you would have to do to look up one of several methods with the same name. The infuriating thing is that you already &lt;span style=&quot;font-style: italic;&quot;&gt;have&lt;/span&gt; the &lt;span class=&quot;code&quot;&gt;MethodBuilder&lt;/span&gt; for it, which is the equivalent of a &lt;span class=&quot;code&quot;&gt;MethodInfo&lt;/span&gt; in its &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt;, so there &lt;span style=&quot;font-style: italic;&quot;&gt;should&lt;/span&gt; be a relatively painless way of getting its counterpart in a live &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt;.&lt;br /&gt;
&lt;br /&gt;
It turns out that there is a rather painless way of doing it and it&#39;s not even a dirty hack. Every type and every member has a metadata token that identifies it uniquely within its module. If you have the metadata token for a method, you can resolve it into a &lt;span class=&quot;code&quot;&gt;MethodBase&lt;/span&gt; by calling &lt;span class=&quot;code&quot;&gt;ResolveMethod&lt;/span&gt; on the method&#39;s &lt;span class=&quot;code&quot;&gt;Module&lt;/span&gt;. The usual way of obtaining a metadata token is via the &lt;span class=&quot;code&quot;&gt;MetadataToken&lt;/span&gt; property defined in the &lt;span class=&quot;code&quot;&gt;MemberInfo&lt;/span&gt; class. That won&#39;t work on a builder class, though. You&#39;ll get an &lt;span class=&quot;code&quot;&gt;InvalidOperationException&lt;/span&gt; complaining that the &quot;operation is not valid due to the current state of the object&quot;.&lt;br /&gt;
&lt;br /&gt;
In order to get the metadata token from a builder class, you have to use a corresponding method in the &lt;span class=&quot;code&quot;&gt;ModuleBuilder&lt;/span&gt; class. For example, if you have a &lt;span class=&quot;code&quot;&gt;MethodBuilder&lt;/span&gt; and want to get its metadata token, you have to call &lt;span class=&quot;code&quot;&gt;GetMethodToken&lt;/span&gt; on its &lt;span class=&quot;code&quot;&gt;ModuleBuilder&lt;/span&gt;. When you put it all together, converting a &lt;span class=&quot;code&quot;&gt;MethodBuilder&lt;/span&gt; for a TypeBuilder into its counterpart &lt;span class=&quot;code&quot;&gt;MethodInfo&lt;/span&gt; for the resulting live &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt; can be done like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static MethodInfo GetLiveMethod(MethodBuilder method)
{
    return (MethodInfo) method.Module.ResolveMethod(
        ((ModuleBuilder) method.Module).GetMethodToken(method).Token);
}&lt;/pre&gt;
&lt;br /&gt;
Clean, concise and probably more efficient.&lt;br /&gt;
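&lt;br /&gt;
To illustrate, here&#39;s a quick sketch of the helper in action; the assembly, module, type and method names are placeholders I made up for the example:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;// Build a throwaway dynamic type with a single static method.
AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly(
    new AssemblyName(&quot;SketchAssembly&quot;), AssemblyBuilderAccess.Run);
ModuleBuilder module = assembly.DefineDynamicModule(&quot;SketchModule&quot;);
TypeBuilder typeBuilder = module.DefineType(&quot;SketchType&quot;, TypeAttributes.Public);

MethodBuilder methodBuilder = typeBuilder.DefineMethod(&quot;GetAnswer&quot;,
    MethodAttributes.Public | MethodAttributes.Static, typeof(int), Type.EmptyTypes);
ILGenerator il = methodBuilder.GetILGenerator();
il.Emit(OpCodes.Ldc_I4, 42);
il.Emit(OpCodes.Ret);

typeBuilder.CreateType();

// No name-based lookup needed: the token resolves straight to the live method.
MethodInfo liveMethod = GetLiveMethod(methodBuilder);
Console.WriteLine(liveMethod.Invoke(null, null)); // prints 42&lt;/pre&gt;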
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Generic Parameters: Working Around The Taint&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
When I was checking out Ruby, I ran into this neat concept of taint. It stuck with me, probably because I read about it in &lt;a href=&quot;http://www.rubycentral.com/pickaxe/taint.html&quot;&gt;The Pragmatic Programmer&#39;s Guide&lt;/a&gt;, whose style is rather memorable. And, of course, because the concept is so neat.&lt;br /&gt;
&lt;br /&gt;
What, then, is taint? Besides being &quot;the opposite of tis&quot;, &#39;tis a way of keeping little Bobby Tables out of mischief. The idea is simple: anything coming from the outside user is considered tainted and anything derived from a tainted value is also considered tainted; if taint checking is active, the tainted values cannot be used for anything dangerous, such as SQL queries or host operating system commands.&lt;br /&gt;
&lt;br /&gt;
What does this have to do with .NET reflection? The generic parameters for dynamic types and dynamic methods behave in a similar fashion. Specifically, if you call an otherwise innocent method &lt;span class=&quot;code&quot;&gt;MakeGenericType&lt;/span&gt; on an existing generic &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt;, you&#39;ll wind up with a &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt; with limited capabilities. Calls like &lt;span class=&quot;code&quot;&gt;GetField&lt;/span&gt; will fail on such a &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt;.&lt;br /&gt;
&lt;br /&gt;
The workaround is actually documented in MSDN, which is why I saved this particular landmine for the end. If you&#39;ve already read this far, chances are you&#39;re going to read and remember this solution before you actually run into the problem.&lt;br /&gt;
&lt;br /&gt;
Here&#39;s what you have to do instead of calling &lt;span class=&quot;code&quot;&gt;GetField&lt;/span&gt; on the &quot;tainted&quot; &lt;span class=&quot;code&quot;&gt;Type&lt;/span&gt;. First, get the &lt;span class=&quot;code&quot;&gt;FieldInfo&lt;/span&gt; from the generic type itself, as in:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;FieldInfo myField = typeof(MyGenericType&amp;lt;&amp;gt;).GetField(&quot;MyField&quot;);&lt;/pre&gt;
&lt;br /&gt;
Then, get the &quot;tainted&quot; type:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;Type constructedType = typeof(MyGenericType&amp;lt;&amp;gt;).MakeGenericType(myGenericParamBuilder);&lt;/pre&gt;
&lt;br /&gt;
Finally, do the workaround magic:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;myField = TypeBuilder.GetField(constructedType, myField);&lt;/pre&gt;
&lt;br /&gt;
Rinse and repeat until you run out of &quot;taint&quot;.&lt;br /&gt;
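&lt;br /&gt;
Putting the three steps together, here&#39;s what the workaround looks like inside typical emit code; &lt;span class=&quot;code&quot;&gt;MyGenericType&lt;/span&gt; and the &lt;span class=&quot;code&quot;&gt;typeBuilder&lt;/span&gt; variable are assumed from context:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;// Make the dynamic type generic and grab its type parameter builder.
GenericTypeParameterBuilder[] typeParams = typeBuilder.DefineGenericParameters(&quot;T&quot;);
GenericTypeParameterBuilder myGenericParamBuilder = typeParams[0];

// Step 1: get the FieldInfo from the open generic type.
FieldInfo myField = typeof(MyGenericType&amp;lt;&amp;gt;).GetField(&quot;MyField&quot;);

// Step 2: construct the &quot;tainted&quot; type with the parameter builder.
Type constructedType = typeof(MyGenericType&amp;lt;&amp;gt;).MakeGenericType(myGenericParamBuilder);

// Step 3: map the FieldInfo onto the constructed type.
myField = TypeBuilder.GetField(constructedType, myField);

// TypeBuilder.GetMethod and TypeBuilder.GetConstructor do the same job
// for methods and constructors of a &quot;tainted&quot; type.&lt;/pre&gt;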
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Further KABOOM Avoidance&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
.NET reflection is really powerful and flexible, but it can be a real pain in the backside if you&#39;re not careful. Now that C# 3 is here, you can opt to use &lt;span class=&quot;code&quot;&gt;Expression&amp;lt;TDelegate&amp;gt;&lt;/span&gt; instead of the old &lt;span class=&quot;code&quot;&gt;DynamicMethod&lt;/span&gt;, but the old-style &lt;span class=&quot;code&quot;&gt;TypeBuilder&lt;/span&gt; is not likely to go anywhere soon. I hope that this map will help you get across the minefield without getting blown up too often.</description><link>http://beardseye.blogspot.com/2009/02/reflection-recipes.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>4</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5571726211329093332</guid><pubDate>Wed, 14 Jan 2009 14:29:00 +0000</pubDate><atom:updated>2011-11-07T22:33:11.322-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">.NET</category><category domain="http://www.blogger.com/atom/ns#">boxing</category><category domain="http://www.blogger.com/atom/ns#">design patterns</category><category domain="http://www.blogger.com/atom/ns#">enums</category><category domain="http://www.blogger.com/atom/ns#">generics</category><category domain="http://www.blogger.com/atom/ns#">interfaces</category><category domain="http://www.blogger.com/atom/ns#">structs</category><title>Interfacing Outside the Box</title><description>Happy New Year, everyone! Another hiatus is over and I hope to make it the last one. I&#39;ve never been one for New Year&#39;s resolutions before, but I guess there&#39;s a first time for everything: this year I resolve to post at least once a month.&lt;br /&gt;
&lt;br /&gt;
I gave myself a head start by accumulating a number of topics I want to write about. After some careful consideration, I decided to start off with a bang: by introducing a new design pattern!&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Hacking the Language&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
A while ago, I wrote about &lt;a href=&quot;http://beardseye.blogspot.com/2007/06/dark-side-of-design-patterns.html&quot;&gt;design patterns&lt;/a&gt; and how they&#39;re often a result of having to work your way around language limitations. No language is perfect, so you&#39;re bound to run into an annoying limitation sooner or later, even if you&#39;re not spending your days coding in &lt;a href=&quot;http://www.paulgraham.com/avg.html&quot;&gt;Blub&lt;/a&gt;. If you&#39;re lucky, there&#39;s already a design pattern that allows you to hack your way around it. If not, you&#39;ll have to invent a solution yourself.&lt;br /&gt;
&lt;br /&gt;
One such limitation in C# is the boxing that occurs when you cast a value type to an interface. I wrote about a related topic &lt;a href=&quot;http://beardseye.blogspot.com/2007/08/nuts-enum-conundrum.html&quot;&gt;before&lt;/a&gt;, back when I was just learning about value types and boxing in C#. You could probably tell, because I made a spectacular mistake in my speculations.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Safe Deposit Box&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
In the course of devising a solution for the problem described in Enum Conundrum, I erroneously suggested that a possible solution might have been to have my enum implement the &lt;span class=&quot;code&quot;&gt;IEquatable&lt;/span&gt; interface, had the language allowed it. That, of course, wouldn&#39;t have helped. The enum would still have been boxed, although for a different reason.&lt;br /&gt;
&lt;br /&gt;
Why do value types get boxed when cast to an interface? The reason is simple enough: for safety. For example, if you have a local value type variable, it resides on the stack and gets destroyed after the call ends. Without boxing, if you stored an interface reference to that value type instance in an object field, that field would wind up with an invalid reference as soon as the original call is done and the variable goes out of scope.&lt;br /&gt;
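&lt;br /&gt;
You can actually watch the box being created with a small, mutable struct; the names here are invented for the demonstration:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public interface ICounter
{
    void Increment();
    int Count { get; }
}

public struct Counter : ICounter
{
    private int count;
    public void Increment() { count++; }
    public int Count { get { return count; } }
}

// ...

Counter counter = new Counter();
ICounter boxed = counter;         // a boxed copy is allocated on the heap here

boxed.Increment();
boxed.Increment();

Console.WriteLine(counter.Count); // 0 -- the original was never touched
Console.WriteLine(boxed.Count);   // 2 -- only the boxed copy was incremented&lt;/pre&gt;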
&lt;br /&gt;
Of course, you might not care about boxing. After all, boxing was invented precisely so that people could use value types and reference types on equal footing: the alternative to it is all that nasty &lt;a href=&quot;http://en.wikipedia.org/wiki/Primitive_wrapper_class&quot;&gt;wrapping&lt;/a&gt; code that was so widespread in Java before it incorporated autoboxing. So boxing is neat, isn&#39;t it?&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Get Back in Line&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
When you define a value type, you (should) do it for a good reason. Most of the time, it&#39;s because you need a type with &lt;a href=&quot;http://msdn.microsoft.com/en-us/library/aa664472%28VS.71%29.aspx&quot;&gt;value semantics&lt;/a&gt;; in other words, you want a type that behaves in such a way that it&#39;s not possible to affect one variable of that type by performing operations on another variable of the same type.&lt;br /&gt;
&lt;br /&gt;
The way value semantics is enforced is through memory allocation. Value types are allocated in-line. This means that the memory allocated for a value type instance is a part of the memory allocated for whatever contains that instance. If it&#39;s a local variable, the instance is allocated directly on the stack. If it&#39;s a field, the instance is contained in the memory allocated for the field&#39;s object. Only when it gets boxed does a value type wind up on the heap independently, all by itself.&lt;br /&gt;
&lt;br /&gt;
At times, this is precisely the reason why you&#39;ll choose a value type: not so much for its semantics, as for its memory allocation. For example, you might be writing a game in XNA and you don&#39;t want the garbage collector kicking in and ruining your frame rate.&lt;br /&gt;
&lt;br /&gt;
Whatever your reasons for wanting to keep them in line, the fact remains that you cannot cast your value types to an interface without them getting boxed. What, then, is the alternative?&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Generic Static&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Fun fact: a static field in a generic class is allocated separately for each &lt;a href=&quot;http://msdn.microsoft.com/en-us/library/aa479859.aspx#fundamentals_topic6&quot;&gt;closed constructed type&lt;/a&gt; of that generic class. What does this mean? Take a look at the following code:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace BeardsEye
{
    class GenericStatic&amp;lt;T&amp;gt;
    {
        public static int SomeVal;
    }

    class Program
    {
        static void Main(string[] args)
        {
            GenericStatic&amp;lt;int&amp;gt;.SomeVal = 1;
            GenericStatic&amp;lt;string&amp;gt;.SomeVal = 2;

            Console.WriteLine(GenericStatic&amp;lt;int&amp;gt;.SomeVal);
            Console.WriteLine(GenericStatic&amp;lt;string&amp;gt;.SomeVal);
        }
    }
}&lt;/pre&gt;
&lt;br /&gt;
If you run it, it will print out 1 and 2. Of course, such behavior &lt;a href=&quot;http://www.awpi.com/Combs/Shaggy/013.html&quot;&gt;stands to reason&lt;/a&gt;; if you&#39;re not sure why, just change the type of &lt;span class=&quot;code&quot;&gt;SomeVal&lt;/span&gt; to &lt;span class=&quot;code&quot;&gt;T&lt;/span&gt;.&lt;br /&gt;
&lt;br /&gt;
Why am I suddenly talking about this, anyway? Because, in some cases, you can use this trick as a substitute for what would otherwise be a dictionary. Specifically, for some uses, you could replace a &lt;span class=&quot;code&quot;&gt;Dictionary&amp;lt;Type, Something&amp;gt;&lt;/span&gt; with:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static class MyTypeDictionary&amp;lt;TType&amp;gt;
{
    public static Something Value;
}&lt;/pre&gt;
&lt;br /&gt;
Of course, this is a far cry from a fully functional dictionary. You don&#39;t have the Count property, there&#39;s no way to remove values, iterate over them or find out whether a type has an associated value or not. And you can&#39;t have an arbitrary number of these &quot;dictionaries&quot;.&lt;br /&gt;
&lt;br /&gt;
On the other hand, when you need to associate something with a type and that &quot;something&quot; changes from type to type, this technique is just perfect.&lt;br /&gt;
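&lt;br /&gt;
For instance, a &lt;span class=&quot;code&quot;&gt;Dictionary&amp;lt;Type, string&amp;gt;&lt;/span&gt; of display names could be replaced like this (the names are invented for the example):&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static class DisplayName&amp;lt;TType&amp;gt;
{
    public static string Value;
}

// &quot;Populating&quot; the dictionary...
DisplayName&amp;lt;int&amp;gt;.Value = &quot;whole number&quot;;
DisplayName&amp;lt;double&amp;gt;.Value = &quot;real number&quot;;

// ...and &quot;looking up&quot; an entry: no hashing, no casts, just a static field read.
Console.WriteLine(DisplayName&amp;lt;int&amp;gt;.Value); // prints &quot;whole number&quot;&lt;/pre&gt;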
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Fun With Functions&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Now we finally get to the solution itself. Suppose you have an interface &lt;span class=&quot;code&quot;&gt;IConfusticatable&lt;/span&gt;, for all types that can be confusticated to a certain degree:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public interface IConfusticatable
{
    double Confusticate(int degree);
}&lt;/pre&gt;
&lt;br /&gt;
You want to be able to confusticate classes that implement this interface, but you would also like to confusticate value types, without boxing them. You could use the following trick:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static class Confusticator&amp;lt;T&amp;gt;
{
    public static Func&amp;lt;T, int, double&amp;gt; Confusticate = ((what, degree) =&amp;gt; ((IConfusticatable) what).Confusticate(degree));
}&lt;/pre&gt;
&lt;br /&gt;
What do you have to do in your value type? Not much:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public struct MyStruct
{
    static MyStruct()
    {
        Confusticator&amp;lt;MyStruct&amp;gt;.Confusticate = ((me, degree) =&amp;gt; me.Confusticate(degree));
    }

    public MyStruct(double val)
    {
        Val = val;
    }

    public double Val;

    public double Confusticate(int degree)
    {
        return Val + degree;
    }
}&lt;/pre&gt;
&lt;br /&gt;
How do you use this when you want to confusticate something? Like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;MyStruct myVar = new MyStruct(17);
Console.WriteLine(Confusticator&amp;lt;MyStruct&amp;gt;.Confusticate(myVar, 3));&lt;/pre&gt;
&lt;br /&gt;
I&#39;m sure that, at this point, you&#39;re thinking &quot;So what good is this? If I already know that &lt;span class=&quot;code&quot;&gt;myVar&lt;/span&gt; is a &lt;span class=&quot;code&quot;&gt;MyStruct&lt;/span&gt;, I can just call its own &lt;span class=&quot;code&quot;&gt;Confusticate&lt;/span&gt; method directly.&quot;&lt;br /&gt;
&lt;br /&gt;
This, of course, is perfectly true, unless you&#39;re writing generic code. In that case you don&#39;t know beforehand what your type parameter will be.&lt;br /&gt;
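&lt;br /&gt;
In generic code, the call site looks like this; &lt;span class=&quot;code&quot;&gt;ConfusticateAll&lt;/span&gt; is a method I made up for the example:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static double[] ConfusticateAll&amp;lt;T&amp;gt;(T[] items, int degree)
{
    double[] results = new double[items.Length];
    for (int i = 0; i &amp;lt; items.Length; i++)
    {
        // No cast to IConfusticatable here, so value types that have
        // registered their own delegate are never boxed.
        results[i] = Confusticator&amp;lt;T&amp;gt;.Confusticate(items[i], degree);
    }
    return results;
}&lt;/pre&gt;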
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Warning: May Contain .NUTS&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
A few words of caution here. The proposed implementation of &lt;span class=&quot;code&quot;&gt;MyStruct&lt;/span&gt; relies on a static constructor, which is supposed to &quot;register&quot; the confustication lambda with the &lt;span class=&quot;code&quot;&gt;Confusticator&lt;/span&gt;. The problem with this is that it&#39;s a bit tricky to make sure that a value type static constructor is invoked.&lt;br /&gt;
&lt;br /&gt;
If you dig around the C# Annotated Standard, you&#39;ll find some fascinating incompatibilities between different standards and their respective implementations.&lt;br /&gt;
&lt;br /&gt;
According to the C# Standard, a static constructor for a value type is executed only when the first of the following occurs:&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;An instance member of the struct is referenced.&lt;/li&gt;
&lt;li&gt;A static member of the struct is referenced.&lt;/li&gt;
&lt;li&gt;An explicitly declared constructor of the struct is called.&lt;/li&gt;
&lt;/ul&gt;
On the other hand, the CLI Standard requires execution only when the first of the following occurs:&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;A static member of the struct is referenced.&lt;/li&gt;
&lt;li&gt;An explicitly declared constructor of the struct is called.&lt;/li&gt;
&lt;/ul&gt;
To further complicate matters, the CLI Standard does &lt;span style=&quot;font-style: italic;&quot;&gt;not&lt;/span&gt; allow execution when an instance member of the struct is referenced. You&#39;ll note that this is in direct conflict with the C# Standard.&lt;br /&gt;
&lt;br /&gt;
If you actually test what happens when you run a C# program on CLR, you&#39;ll see that the static constructor is called as soon as the first of the following occurs:&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;An instance &lt;span style=&quot;font-style: italic;&quot;&gt;method&lt;/span&gt; of the struct is referenced.&lt;/li&gt;
&lt;li&gt;An instance &lt;span style=&quot;font-style: italic;&quot;&gt;property&lt;/span&gt; of the struct is referenced.&lt;/li&gt;
&lt;li&gt;A static member of the struct is referenced.&lt;/li&gt;
&lt;li&gt;An explicitly declared constructor of the struct is called.&lt;/li&gt;
&lt;/ul&gt;
However, the static constructor is &lt;span style=&quot;font-style: italic;&quot;&gt;not&lt;/span&gt; called if you reference an instance &lt;span style=&quot;font-style: italic;&quot;&gt;field&lt;/span&gt; of the struct. Not only does the CLR implementation fail to conform to either standard, it does so in a way that seems spectacularly arbitrary. I&#39;m sure there are some perfectly valid -- if arcane and obscure -- reasons for this.&lt;br /&gt;
&lt;br /&gt;
All in all, it would be a lot more reliable to stick the &quot;registration&quot; code in some initialization method that you know will be called. That&#39;s what you would have to do anyway, if you wanted to register a confustication function for an enum, for example.&lt;br /&gt;
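&lt;br /&gt;
For example, you could expose an explicit registration method on the struct, or force the static constructor to run yourself; &lt;span class=&quot;code&quot;&gt;RunClassConstructor&lt;/span&gt; is a real BCL method, though how you wire it into your initialization is up to you:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;// Option 1: an explicit registration method, called from some known init path.
public static void Register()
{
    Confusticator&amp;lt;MyStruct&amp;gt;.Confusticate = ((me, degree) =&amp;gt; me.Confusticate(degree));
}

// Option 2: force the static constructor to run before the first use.
System.Runtime.CompilerServices.RuntimeHelpers.RunClassConstructor(
    typeof(MyStruct).TypeHandle);&lt;/pre&gt;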
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Finishing Touches&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
What if your &lt;span class=&quot;code&quot;&gt;IConfusticatable&lt;/span&gt; interface has more than one method? There are several possible solutions, but the simplest one would be to give your &lt;span class=&quot;code&quot;&gt;Confusticator&amp;lt;T&amp;gt;&lt;/span&gt; one static delegate field per &lt;span class=&quot;code&quot;&gt;IConfusticatable&lt;/span&gt; method.&lt;br /&gt;
&lt;br /&gt;
Speaking of &lt;span class=&quot;code&quot;&gt;Confusticator&amp;lt;T&amp;gt;&lt;/span&gt;, if you find it too ugly to write &lt;span class=&quot;code&quot;&gt;Confusticator&amp;lt;MyStruct&amp;gt;.Confusticate(myVar, 3)&lt;/span&gt;, you can use type inference to introduce some syntactic sugar:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static class Confusticator&amp;lt;T&amp;gt;
{
    public static Func&amp;lt;T, int, double&amp;gt; Confusticate = ((what, degree) =&amp;gt; ((IConfusticatable) what).Confusticate(degree));
}

public static class Confusticator
{
    public static double Confusticate&amp;lt;T&amp;gt;(T what, int degree)
    {
        return Confusticator&amp;lt;T&amp;gt;.Confusticate(what, degree);
    }
}&lt;/pre&gt;
&lt;br /&gt;
With that, you would be able to simply write &lt;span class=&quot;code&quot;&gt;Confusticator.Confusticate(myVar, 3)&lt;/span&gt; and have the compiler infer the type automagically.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Share and Enjoy&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Well, that&#39;s all for now. For those of you who have a legitimate need to avoid boxing your value types, I hope this proves useful. For the rest, I hope that you found this an interesting trick.&lt;br /&gt;
&lt;br /&gt;
I&#39;m planning to write more recipes for getting stuff that C# doesn&#39;t have, such as multiple dispatch, so stay tuned!</description><link>http://beardseye.blogspot.com/2009/01/interfacing-outside-box.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-7966663432663317570</guid><pubDate>Thu, 24 Jul 2008 20:40:00 +0000</pubDate><atom:updated>2008-07-24T16:41:47.639-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">games</category><category domain="http://www.blogger.com/atom/ns#">hype</category><category domain="http://www.blogger.com/atom/ns#">piracy</category><category domain="http://www.blogger.com/atom/ns#">tpm</category><category domain="http://www.blogger.com/atom/ns#">trusted computing</category><title>&quot;Encryption Chip&quot; Will Not End Piracy</title><description>&lt;span style=&quot;font-weight: bold;&quot;&gt;Nolan Bushnell is full of it&lt;/span&gt;. There, I finally got that off my chest. It&#39;s arguably acerbic and rather rude, but it needed to be said. You have no idea how hard I&#39;ve tried to avoid saying it. After all, he&#39;s the founder of Atari and a historical figure in his own right. He deserves a certain respect for that.&lt;br /&gt;&lt;br /&gt;The first time Nolan Bushnell claimed that the &lt;a aiotitle=&quot;&amp;quot;encryption chip&amp;quot; will end piracy&quot; href=&quot;http://www.gamesindustry.biz/articles/encryption-chip-will-end-piracy-open-markets-says-bushnell&quot;&gt;&quot;encryption chip&quot; will end piracy&lt;/a&gt;, I exercised due restraint. 
His statement reverberated all over the Internet, causing reactions that ranged from &lt;a href=&quot;http://news.bigdownload.com/2008/05/24/bushnell-foolproof-final-solution-to-game-piracy-imminent/&quot;&gt;mild skepticism&lt;/a&gt; on one end of the spectrum to &lt;a href=&quot;http://www.schneier.com/blog/archives/2008/05/tpm_to_end_pira.html&quot;&gt;derision&lt;/a&gt; and &lt;a aiotitle=&quot;disgust&quot; href=&quot;http://games.slashdot.org/article.pl?sid=08/05/26/153222&quot;&gt;disgust&lt;/a&gt; on the other.&lt;br /&gt;&lt;br /&gt;So why am I writing now, more than two months later, if nobody believed him in the first place? In other words, why am I beating a dead horse? Partly, it&#39;s because &lt;a href=&quot;http://www.gamasutra.com/view/feature/3717/nolan_bushnell_what_the_game_.php?page=4&quot;&gt;he did it again&lt;/a&gt; and it pisses me off. Mostly, though, it&#39;s because I&#39;m rather interested in copy protections and security; it&#39;s sort of a hobby of mine.&lt;br /&gt;&lt;br /&gt;The most important lessons you learn in those two fields are that &lt;a href=&quot;http://www.guardian.co.uk/technology/2007/sep/04/lightspeed&quot;&gt;no protection is perfect&lt;/a&gt; and every solution spawns a new class of problems. This means that there will never be one final (technical) solution to the issue of piracy; there is no silver bullet. The experts from both fields are locked in an arms race with their adversaries. 
Once you&#39;ve learned that, you&#39;ll have no problem recognizing that Nolan Bushnell is really just &lt;a href=&quot;http://digitaldaily.allthingsd.com/20080526/bushnell/&quot;&gt;flogging his merch&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;However, the issue runs deeper than that.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Copy Protection and Security&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;When I referred to copy protections and security I said &quot;two fields&quot;, even though one can be considered a subset of the other; after all, copy protections are supposed to prevent the unauthorized use of software. Even though this is technically true, there are some drastic differences between the two.&lt;br /&gt;&lt;br /&gt;An important difference is the level of cooperation from the users. When it comes to information security, the users actively cooperate with the protection systems, because it&#39;s in their best interest. You don&#39;t give access to your bank account to all your friends, do you?&lt;br /&gt;&lt;br /&gt;On the other hand, copy protections often clash with the users&#39; interests. Some of these interests are illegal, such as downloading a commercial game for free. But other interests are quite legal and legitimate. You added more memory to your computer? Odds are you might have to reactivate your Windows.&lt;br /&gt;&lt;br /&gt;Another important difference is that a copy protection has to protect an application that lives on the user&#39;s computer. Unless we&#39;re talking about an MMOG, there&#39;s no server counterpart that executes a critical piece of code, without which the game can&#39;t work.&lt;br /&gt;&lt;br /&gt;When you put those two things together, it becomes obvious why you can&#39;t make a perfect copy protection: you&#39;re relying on cooperation from a user that has complete control over his copy of your content or software. If that user doesn&#39;t want to cooperate, the best you can do is delay him. 
Even unbreakable ciphers won&#39;t help you, because sooner or later you&#39;ll have to decrypt the content and, when you do, the user will nab it.&lt;br /&gt;&lt;br /&gt;But what if you could alter these conditions? You could make sure that there&#39;s a critical part of an application that executes somewhere where the user doesn&#39;t have control over it: that&#39;s what MMOGs do. The other option is to take away the control from the user.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Trust Controversy&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Enter the &quot;trusted computing&quot;. The first time I heard of it was back when Microsoft was touting &lt;a href=&quot;http://en.wikipedia.org/wiki/Next-Generation_Secure_Computing_Base&quot;&gt;Palladium&lt;/a&gt;. Back then, it sounded like a bad pun: a company found guilty in an antitrust lawsuit is proposing to build a &quot;trusted computing platform&quot; for its users. The irony was &lt;a href=&quot;http://news.zdnet.com/2100-9595_22-939817.html&quot;&gt;not lost&lt;/a&gt; on anyone and it provoked some &lt;a href=&quot;http://www.schneier.com/crypto-gram-0208.html#1&quot;&gt;enlightening responses&lt;/a&gt; from security experts.&lt;br /&gt;&lt;br /&gt;Then, since nothing really seemed to happen and we didn&#39;t all suddenly wake up in some digital equivalent of &lt;a href=&quot;http://www.amazon.com/gp/product/0451524934?ie=UTF8&amp;amp;tag=besey-20&amp;amp;linkCode=as2&amp;amp;camp=1789&amp;amp;creative=9325&amp;amp;creativeASIN=0451524934%22%3E1984%3C/a%3E%3Cimg%20src=%22http://www.assoc-amazon.com/e/ir?t=besey-20&amp;amp;l=as2&amp;amp;o=1&amp;amp;a=0451524934&quot;&gt;1984&lt;/a&gt;, I lost track of this topic for a while. I forgot about it until Nolan Bushnell started his TPM hype. 
A quick search engine query revealed that TPM stands for &quot;Trusted Platform Module&quot; and that it&#39;s the central component of &quot;trusted computing&quot;.&lt;br /&gt;&lt;br /&gt;What, then, is the so-called &quot;trusted computing&quot;? It&#39;s a technology that encompasses the following concepts:&lt;br /&gt;&lt;ol&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Endorsement key&lt;/span&gt; is a cryptographic key pair unique to one computer. The chief use for it is to prove the computer&#39;s identity.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Secure I/O&lt;/span&gt; makes sure that the communication between the user and their software is secure and cannot be intercepted or altered.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Memory curtaining&lt;/span&gt; protects those parts of memory that contain sensitive data (such as cryptographic keys) from unauthorized access, even by the operating system itself.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Sealed storage&lt;/span&gt; binds data to the specific platform -- both hardware and software -- so that you cannot access it from any other platform.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Remote attestation&lt;/span&gt; allows authorized parties to detect changes to the platform configuration in order to make sure that they meet the expected parameters; in other words, to prove that nobody tampered with the platform.&lt;br /&gt;&lt;/li&gt;&lt;/ol&gt;That&#39;s just a brief summary, to give you an idea of what we&#39;re talking about here. 
If you want more information, I recommend that you start at &lt;a href=&quot;http://en.wikipedia.org/wiki/Trusted_Computing&quot;&gt;Wikipedia&lt;/a&gt; and then go on directly to the Trusted Computing Group &lt;a href=&quot;https://www.trustedcomputinggroup.org/&quot;&gt;site&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;So, the core idea is to make computers more secure, by ensuring that no &quot;untrusted&quot; code has access to your stuff. At least, that&#39;s &lt;span style=&quot;font-style: italic;&quot;&gt;supposed&lt;/span&gt; to be the core idea. Unfortunately, there has been a great deal of confusion about the word &quot;trust&quot; in &quot;trusted computing&quot;. Specifically, who is supposed to trust whom?&lt;br /&gt;&lt;br /&gt;If you read Bruce Schneier&#39;s &lt;a href=&quot;http://www.schneier.com/blog/archives/2005/08/trusted_computi.html&quot;&gt;essay&lt;/a&gt; on &quot;trusted computing&quot;, you&#39;ll notice that there&#39;s a good deal of controversy and confusion surrounding the issue. 
As one commenter so aptly put it, the only one &lt;span style=&quot;font-style: italic;&quot;&gt;not&lt;/span&gt; trusted seems to be the owner of the computer.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;All Your Base&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Each of the five concepts of &quot;trusted computing&quot; addresses a real security problem:&lt;br /&gt;&lt;ol&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Endorsement keys&lt;/span&gt; would be used to mitigate spoofing concerns in secure transactions by establishing the identity of each party involved.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Secure I/O&lt;/span&gt; is supposed to avoid security breaches through techniques such as keylogging.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Memory curtaining&lt;/span&gt; would make sure that sensitive information, such as cryptographic keys, is not allowed to &quot;leak&quot; somewhere where it could be extracted by malicious parties.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Sealed storage&lt;/span&gt; would do a similar thing for sensitive information in non-volatile storage.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Remote attestation&lt;/span&gt; could help network administrators easily detect intrusions and attacks on their machines.&lt;/li&gt;&lt;/ol&gt;Yet, after a closer look at them, it becomes evident that there&#39;s plenty of room for abuse. 
Imagine, for example, a system that enforces specific usage policies on your data:&lt;br /&gt;&lt;ul&gt;&lt;li&gt;It would use &lt;span style=&quot;font-weight: bold;&quot;&gt;sealed storage&lt;/span&gt; to bind that data to a particular application or set of applications that you&#39;re allowed to use on that data.&lt;/li&gt;&lt;li&gt;It would employ &lt;span style=&quot;font-weight: bold;&quot;&gt;memory curtaining&lt;/span&gt; to make sure you cannot extract that data directly from memory.&lt;/li&gt;&lt;li&gt;It would use &lt;span style=&quot;font-weight: bold;&quot;&gt;secure I/O&lt;/span&gt; to make sure you cannot intercept it on its way somewhere else.&lt;/li&gt;&lt;li&gt;It would use &lt;span style=&quot;font-weight: bold;&quot;&gt;remote attestation&lt;/span&gt; to report if you tamper with any part of the system.&lt;/li&gt;&lt;li&gt;And it would clearly identify you as a &quot;culprit&quot; to whoever is interested in enforcing those policies, if it possessed both your personal information and your &lt;span style=&quot;font-weight: bold;&quot;&gt;endorsement key&lt;/span&gt;.&lt;/li&gt;&lt;/ul&gt;Is there any kind of usage policy that springs immediately to mind? There are two, actually: DRM and vendor lock-in. Ross Anderson describes several ways to abuse TC in his &lt;a href=&quot;http://www.cl.cam.ac.uk/%7Erja14/tcpa-faq.html&quot;&gt;FAQ&lt;/a&gt;. Richard Stallman dedicates a whole &lt;a href=&quot;http://www.gnu.org/philosophy/can-you-trust.html&quot;&gt;chapter&lt;/a&gt; to this topic, in his book &quot;Free Software, Free Society&quot;; although slightly reminiscent of Book of Revelations in tone, it offers some very interesting insights.&lt;br /&gt;&lt;br /&gt;Another interesting aspect of &quot;trusted computing&quot; is that it actually raises the stakes when it comes to information security: imagine a worm that successfully exploits a bug in the supposedly secure OS code to install a &quot;trusted&quot; rootkit? 
Talk about irony.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Pirates vs. Ninjas&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Getting back to the original topic, does this mean that Nolan Bushnell is right? Is his &quot;stealth encryption chip&quot; really going to send all the pirates to the Davy Jones&#39;s Locker? Not by a long shot! Remember, if the software in question does not have some critical code running on some computer under control of some &quot;authority&quot;, you can eventually break its copy protection.&lt;br /&gt;&lt;br /&gt;When it comes to policy enforcement, the most important part of the &quot;trusted computing&quot; is the &lt;span style=&quot;font-weight: bold;&quot;&gt;remote attestation&lt;/span&gt;. This is the way to ensure you won&#39;t tamper with the policy enforcement code. Incidentally, it requires you to be online. Now back up a couple of months and remember what happened when BioWare &lt;a href=&quot;http://blog.wired.com/games/2008/05/mass-effect-pc.html&quot;&gt;tried to pull that trick&lt;/a&gt; on its players.&lt;br /&gt;&lt;br /&gt;If you believe that pirates can&#39;t hide behind this forever, think again. There are numerous valid reasons to resist the attempts to introduce an &lt;span style=&quot;font-style: italic;&quot;&gt;artificial&lt;/span&gt; dependency on Internet connection into software and all those reasons boil down to one: the connection is not always available, yet the artificial nature of the dependency means that the software doesn&#39;t actually &lt;span style=&quot;font-style: italic;&quot;&gt;need&lt;/span&gt; it to work properly.&lt;br /&gt;&lt;br /&gt;Besides, people are &lt;a href=&quot;http://doi.ieeecomputersociety.org/10.1109/MSP.2005.40&quot;&gt;not yet convinced&lt;/a&gt; that &quot;trusted computing&quot; will actually make things better. Plus, there are all sorts of concerns about privacy and also about practicality of the whole approach. 
Still, the Trusted Computing Group has been formed, the commercial motivation is there, and &quot;trusted computing&quot; will keep rolling until all doubts and concerns have been dealt with, one way or another.&lt;br /&gt;&lt;br /&gt;What, then, is the worst blow &quot;trusted computing&quot; could deal to pirates? Indulge me a bit, as I let my imagination run wild and explore a &quot;what-if&quot; future.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Crack Dealers&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Back when I was a little kid, learning what made the cute little Spectrum 48 tick, pirates were selling games on audio tapes. Today, pirated games are free. They are cracked for free, by enthusiasts; they are uploaded for free, on sites that survive on advertising or donations; and they are downloaded for free. I can still see some pirates in the streets, selling CDs and DVDs, but I&#39;m sure they won&#39;t be buying any Ferraris with that money.&lt;br /&gt;&lt;br /&gt;Fast forward to the time when &quot;trusted computing&quot; is in full swing. To crack protections, pirates need very specialized software, maybe even some hardware, and a lot more effort than before. More than ever, piracy is something that only a select few can do.&lt;br /&gt;&lt;br /&gt;However, it is also more lucrative than ever. As usage policies are enforced more rigorously, the multitudes who used to obtain their entertainment for free now have to go and buy it. The big companies take advantage of that, and prices are even higher than before. You can buy an overpriced game directly from its publisher; or you can take a chance and buy yourself a pirated copy from a local &quot;software crack dealer&quot;. It&#39;s illegal, sure, but it&#39;s a lot cheaper and you can afford to buy a lot more.&lt;br /&gt;&lt;br /&gt;Suddenly, pirates are not your everyday enthusiasts anymore; instead, they&#39;re rich criminals. They have bodyguards with guns. 
They have shady lawyers. They have money laundering enterprises and fake fronts and lots of connections. They know powerful people. In your efforts to eradicate a problem, you managed to make it &lt;a href=&quot;http://www.schneier.com/blog/archives/2005/12/car_thieves_ada.html&quot;&gt;mutate into something worse&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;If this seems improbable and exaggerated, that&#39;s okay: I don&#39;t believe it&#39;s likely to happen. My point is that you should always be on the lookout for &lt;a href=&quot;http://www.econlib.org/library/Enc/UnintendedConsequences.html&quot;&gt;unintended consequences&lt;/a&gt;. It would be nice if, just for once, we asked ourselves where we&#39;re going before we get there.</description><link>http://beardseye.blogspot.com/2008/07/encryption-chip-will-not-end-piracy.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>3</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-8302518161335757990</guid><pubDate>Sun, 20 Jul 2008 23:13:00 +0000</pubDate><atom:updated>2008-07-20T19:24:14.247-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">firefox</category><category domain="http://www.blogger.com/atom/ns#">mozilla</category><title>Update: Firefox 3 Works After All</title><description>Consider this a deep breath before I plunge into some more serious blogging. It turns out that Firefox 3 works rather well, after all. Not that I doubted it, but I couldn&#39;t really try it out until all my favorite add-ons were updated and relatively stable.&lt;br /&gt;&lt;br /&gt;The biggest difference I&#39;ve noticed so far is that it doesn&#39;t do its famous hog-the-CPU trick on pages such as Google Spreadsheets. 
Now &lt;span style=&quot;font-weight: bold;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;that&lt;/span&gt;&lt;/span&gt; is a big relief for me; I absolutely hated that particular bug.&lt;br /&gt;&lt;br /&gt;Make no mistake: I didn&#39;t change my mind about my first experience with Firefox 3. I still believe the developers should have allowed their users to do one of the following:&lt;br /&gt;&lt;ol&gt;&lt;li&gt;check the add-on compatibility during the installation&lt;/li&gt;&lt;li&gt;revert to Firefox 2&lt;/li&gt;&lt;li&gt;run Firefox 2 and Firefox 3 side by side&lt;/li&gt;&lt;/ol&gt;But, hey, at least it works now. And I&#39;ve only confirmed that I&#39;m a late adopter by nature. As if I really needed to confirm that: I&#39;d lived in Chile for nine years before I decided to try &lt;a href=&quot;http://en.wikipedia.org/wiki/Carmenere&quot;&gt;Carménère&lt;/a&gt;.</description><link>http://beardseye.blogspot.com/2008/07/update-firefox-3-works-after-all.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>1</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-2814869412449598712</guid><pubDate>Wed, 18 Jun 2008 01:46:00 +0000</pubDate><atom:updated>2008-11-13T13:09:45.537-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">firefox</category><category domain="http://www.blogger.com/atom/ns#">hype</category><category domain="http://www.blogger.com/atom/ns#">mozilla</category><title>Firefox 3: A Five Act Tragedy</title><description>&lt;span style=&quot;font-weight: bold;&quot;&gt;Act I: The Hype&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;I first heard of the Firefox Download Day at the office, from a co-worker. It wasn&#39;t a dark and stormy night, however much we all wished it to be -- the A/C was on the blink yet again.&lt;br /&gt;&lt;br /&gt;Naturally, my curiosity was piqued. 
I&#39;ve been using Firefox for years now and I&#39;ve grown used to it, along with a bunch of add-ons. Sure, it had a couple of annoying bugs, but I could live with those. Of course, a new and improved version was likely to get rid of them, which would have been incentive enough to check it out.&lt;br /&gt;&lt;br /&gt;So I went to the &quot;Spread Firefox!&quot; site to find out what the fuss was all about. When I found that Mozilla not only planned to give us all a new version of my browser of choice, but also to establish a Guinness World Record, I was hooked. I pledged right away and also scheduled a reminder in my Google Calendar, to make sure I wouldn&#39;t forget.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Act II: The Outage&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;It seems childish in retrospect, but I got really excited. Not only was I going to get my hands on the hottest piece of software on the Web, but I was also going to be a part of the effort to set a Guinness World Record. And I wasn&#39;t the only one around: a few other co-workers were as excited about this as I was.&lt;br /&gt;&lt;br /&gt;When the big day finally came, I made sure I was busy working to keep my mind off the wait. The time passed quickly enough and when the clock was finally done ticking its way to 1 pm, I went back to the Firefox site. 
That&#39;s when I got my first surprise: the site was down.&lt;br /&gt;&lt;br /&gt;&lt;div style=&quot;text-align: left;&quot;&gt;&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJwnM0i9K6K5ri_UjU7rhOgPX1vnNSllOMUo39NpzeOkPObNoKNahyphenhyphen9FA2Fg8FwD8Ca5eHXOlfz0XrQL8dm0tLG7C6nUCexw3Kx4TSSEQA8KcrGPZA2d_bLUoXdrUGII9Q236UkzCtMiY/s1600-h/Gogogo.jpg&quot;&gt;&lt;img style=&quot;margin: 0pt 10px 10px 0pt; float: right; cursor: pointer; width: 190px; height: 142px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJwnM0i9K6K5ri_UjU7rhOgPX1vnNSllOMUo39NpzeOkPObNoKNahyphenhyphen9FA2Fg8FwD8Ca5eHXOlfz0XrQL8dm0tLG7C6nUCexw3Kx4TSSEQA8KcrGPZA2d_bLUoXdrUGII9Q236UkzCtMiY/s320/Gogogo.jpg&quot; alt=&quot;&quot; id=&quot;BLOGGER_PHOTO_ID_5213052578312499234&quot; border=&quot;0&quot; /&gt;&lt;/a&gt;It&#39;s not unusual for a site to be brought to its knees by &lt;a href=&quot;http://blog.mozilla.com/blog/2008/06/17/firefox-3-coming-soon/&quot;&gt;overwhelming interest&lt;/a&gt;, but I would have expected Mozilla to be prepared. After all, they were the ones who did their best to attract all that attention. Why make a fuss if you can&#39;t cope with the results?&lt;br /&gt;&lt;/div&gt;&lt;br /&gt;After a couple of hours of doing a Wile E. Coyote impression, Mozilla finally got its act together and I was able to download my copy. Once it was on my disk, I wasted no more time and clicked through the install impatiently. At last, the moment of truth was at hand: I launched Firefox 3!&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Act III: The Incompatibility&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;The first thing my new Firefox did was to check my add-ons. It then proceeded to notify me that less than half of them had updates compatible with version 3 of Firefox. 
That left more than half of my add-ons disabled.&lt;br /&gt;&lt;br /&gt;Being a coder means that an essential part of your job is to imagine worst cases for your code, so I generally try to be an optimist about everything else, just to balance things out. Therefore I surmised that the Mozilla site was still experiencing difficulties. Did I mention that my optimism is rarely founded?&lt;br /&gt;&lt;br /&gt;I spent some time fumbling with the new user interface, only to find out that the developers didn&#39;t stop at building add-on search into Firefox itself. They also decided to remove the link that took users directly to the add-on page. I ended up checking manually and the add-on page was working quite well, thank you.&lt;br /&gt;&lt;br /&gt;That&#39;s where I felt Mozilla managed to set a record, although it wasn&#39;t the one they were hoping for: I went from enthusiastic to disappointed in less than five minutes after installing their software. What were they thinking?&lt;br /&gt;&lt;br /&gt;Pretend you&#39;re in charge of developing a rather successful software product. One of the key advantages it has over the competition is its extensibility. There&#39;s a huge variety of add-ons for it and they do lots of different things, ranging from silly to extremely useful. Even though your users might be dissatisfied with some aspects of your software, they would still be reluctant to change to a product that doesn&#39;t have the add-ons they use. So what do you do when you decide to develop a shiny new version of your product?&lt;br /&gt;&lt;br /&gt;The fashionable answer seems to be &quot;break the add-ons that worked nicely with the older versions&quot;, even though &lt;a href=&quot;http://www.joelonsoftware.com/articles/APIWar.html&quot;&gt;history teaches us the opposite&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;Okay, I understand that you can&#39;t always maintain compatibility. 
Sometimes, preserving compatibility is in direct conflict with a higher-priority design goal and you just have to go ahead and break stuff. What you do in those situations is try to minimize the damage, which is what Mozilla &lt;a href=&quot;http://alex.polvi.net/2008/06/05/state-of-the-add-ons-report-june-5th/&quot;&gt;tried to do&lt;/a&gt;. But what do you do about the stuff that remains broken?&lt;br /&gt;&lt;br /&gt;Here&#39;s what you do: you make your installer check the compatibility and report to the user &lt;span style=&quot;font-weight: bold;&quot;&gt;before&lt;/span&gt; installing the new version and breaking your user&#39;s stuff.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Act IV: The Retreat&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Once again life conspired to remind me why I&#39;m a late adopter. After a minute or two of staring at the list of things I wouldn&#39;t be able to use with Firefox 3, I decided to go back to version 2. A routine check of my download folder revealed that I apparently got rid of the installer at some point. No big deal, I can download it from the Firefox site, right?&lt;br /&gt;&lt;br /&gt;Wrong! There is no link to Firefox 2 anywhere on the whole page. No matter how hard you try, you won&#39;t be offered a chance to turn back. And I tried very hard, trust me. Among other things, I tried typing &quot;firefox 2&quot; into the search box. That particular attempt rewarded me with zero results. I didn&#39;t even get a link to a press release or release notes for Firefox 2. Zilch. 
Firefox 2 has been exterminated.&lt;br /&gt;&lt;br /&gt;Despite all its pictures of birds and balloons and a smiling Sun, the message Mozilla is sending its users is loud and clear: &lt;span style=&quot;font-weight: bold; font-style: italic;&quot;&gt;&quot;Assume the position!&quot;&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;In the end, I found the download link on &lt;a href=&quot;http://www.firefox2.com/&quot;&gt;www.firefox2.com&lt;/a&gt;, a site that features a prominent notice that it isn&#39;t associated with Mozilla in any way. It worked for me, but it probably won&#39;t work for you. As I&#39;m writing, I&#39;m also looking at the new content of the site, where every occurrence of &quot;Firefox 2&quot; has been replaced with &quot;Firefox 3&quot;, except the site URL, of course.&lt;br /&gt;&lt;br /&gt;Just in case you really need to go back, the solution is to grab the Firefox 3 download link and change it manually to request version 2.0.0.14. For example, the correct link to download the English (US) version of Firefox 2.0.0.14 for Windows is:&lt;br /&gt;&lt;pre class=&quot;listing&quot;&gt;&lt;a href=&quot;http://download.mozilla.org/?product=firefox-2.0.0.14&amp;amp;os=win&amp;amp;lang=en-US&quot;&gt;http://download.mozilla.org/?product=firefox-2.0.0.14&amp;amp;os=win&amp;amp;lang=en-US&lt;/a&gt;&lt;/pre&gt;&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Act V: The Repose&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Having successfully overcome the unexpected difficulties, I decided it was time to wrap things up and go back to work. I installed Firefox 2 over its younger sibling and launched it. My optimism was proven unfounded yet again: Firefox neatly crashed when it tried to load my saved session.&lt;br /&gt;&lt;br /&gt;I tried a few things before I stopped fighting the inevitable and uninstalled Firefox. 
Hoping fervently that the problem would be solved this time, I reinstalled it and launched it.&lt;br /&gt;&lt;br /&gt;By this time I was more than a mite miffed or a tad testy: I was positively pissed off. In the spirit of their decision that I wouldn&#39;t need version 2 ever again, Mozilla obviously didn&#39;t bother to test the downgrade process.&lt;br /&gt;&lt;br /&gt;Fortunately, it turned out that no drastic measures were necessary. I was back to Firefox 2 and all my add-ons worked again.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Epilogue&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;While writing this post, I stumbled upon an article that explains &lt;a href=&quot;http://www.networkworld.com/news/2008/051908-make-older-add-ons-work-with.html?ts0hb=&amp;amp;story=ts_ff&quot;&gt;how to make older add-ons work with new Firefox&lt;/a&gt;. In short, it describes a quick hack that makes Firefox skip the step where it checks for add-on compatibility. The article was about version 3.0rc1, so I don&#39;t know whether this will work in the final release. I didn&#39;t try it and I don&#39;t intend to. I prefer to wait until all the add-ons I use are updated. 
Having a new browser that&#39;s shiny and sleek is cool, but being able to use it my way is way cooler.</description><link>http://beardseye.blogspot.com/2008/06/firefox-3-five-act-tragedy.html</link><author>noreply@blogger.com (Anonymous)</author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJwnM0i9K6K5ri_UjU7rhOgPX1vnNSllOMUo39NpzeOkPObNoKNahyphenhyphen9FA2Fg8FwD8Ca5eHXOlfz0XrQL8dm0tLG7C6nUCexw3Kx4TSSEQA8KcrGPZA2d_bLUoXdrUGII9Q236UkzCtMiY/s72-c/Gogogo.jpg" height="72" width="72"/><thr:total>2</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5130708472720676614</guid><pubDate>Thu, 15 May 2008 19:28:00 +0000</pubDate><atom:updated>2008-07-26T22:49:34.793-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">games</category><category domain="http://www.blogger.com/atom/ns#">market</category><category domain="http://www.blogger.com/atom/ns#">paradigms</category><category domain="http://www.blogger.com/atom/ns#">piracy</category><category domain="http://www.blogger.com/atom/ns#">society</category><title>Piracy is Here to Stay</title><description>&quot;With great power comes great responsibility.&quot; Wise words, Uncle Ben, but someone at BioWare &lt;a href=&quot;http://blog.wired.com/games/2008/05/mass-effect-pc.html&quot;&gt;wasn&#39;t listening&lt;/a&gt;. For those who hate following links, the short version of the story is that BioWare was going to implement draconian anti-piracy measures in &lt;span style=&quot;font-style: italic;&quot;&gt;Mass Effect&lt;/span&gt; and &lt;span style=&quot;font-style: italic;&quot;&gt;Spore&lt;/span&gt; for PC. In the end, they &lt;a href=&quot;http://blog.wired.com/games/2008/05/ea-loosens-spor.html&quot;&gt;won&#39;t be doing it&lt;/a&gt;, but it wasn&#39;t for the lack of trying. 
Indeed, I tend to agree with Penny Arcade when it comes to predicting the &lt;a href=&quot;http://www.penny-arcade.com/comic/2008/5/9/&quot;&gt;next step&lt;/a&gt; in the war on piracy, a war that has a long history of going overboard.&lt;br /&gt;&lt;br /&gt;What&#39;s interesting is the widespread reaction among the gamers:&lt;br /&gt;&lt;blockquote&gt;Not only does this not make me want to buy the game, this makes me want to download a pirated and cracked copy even more&lt;/blockquote&gt;Funny, isn&#39;t it? I mean, when someone mentions using RFID for shoplifting prevention, you&#39;ll hear all sorts of privacy concerns and doubts about the usefulness of such a system, but you won&#39;t hear people say they&#39;ll go shoplifting in protest.&lt;br /&gt;&lt;br /&gt;So what is it that makes us perceive and react to piracy in such a different way? Let&#39;s take a look at some of the factors involved. Be warned, though: there&#39;s no magical happy ending to this story. Whether the hero defeats the villain and gets the girl remains to be seen.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Thou Shalt Not&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Back where I originally come from, some 15 years ago, shops didn&#39;t have any shoplifting prevention mechanisms besides humans and, if the shop was large enough, mirrors. We all knew shoplifting was wrong, that it was stealing, that it was &lt;span style=&quot;font-style: italic;&quot;&gt;crime&lt;/span&gt;. We also knew that getting caught would have serious repercussions. Yet there were kids who did it. Not out of necessity, either. They did it for fun. It was challenging. It was a game.&lt;br /&gt;&lt;br /&gt;Where am I going with this? There are several points I&#39;m trying to make here. First of all, crime has a lot to do with perception. Sure, there&#39;s the law, with its letter and its spirit and a bunch of lawyers and politicians to play with it. 
If you break the law, you&#39;re a criminal and that should be it. And yet, none of those kids saw themselves as criminals.&lt;br /&gt;&lt;br /&gt;Second, it&#39;s in people&#39;s nature to do as they wish, rules be damned. You can try to educate them, to impress upon them that such behavior is wrong, but there will still be those who will do it, and for a variety of reasons, too.&lt;br /&gt;&lt;br /&gt;It might sound like I&#39;m trying to justify piracy, theft and crime in general. I&#39;m not. I&#39;m trying to point out that they will always exist and that the reasons for them are not as monolithic as we would like to believe. After all, it would be a lot simpler if people committed crimes because they&#39;re evil, but the world doesn&#39;t work that way. And if we want to do something about a problem, we have to understand the cause.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Can&#39;t Touch This&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Every time I rent a DVD, I have to watch one of the MPAA&#39;s &lt;a href=&quot;http://www.youtube.com/watch?v=iPcHhOBd-hI&quot;&gt;anti-piracy videos&lt;/a&gt;. There&#39;s no way to skip it, of course, so each time I&#39;m freshly reminded of the first and foremost difference between stealing and piracy: intangibility. In order to steal a car, a handbag, a television or a DVD, just like in the video, you have to reach out and grab something tangible. There might be a lot of objects &lt;span style=&quot;font-style: italic;&quot;&gt;like&lt;/span&gt; that one, but there is only one of that particular object and it&#39;s not yours. Taking possession of it is clearly identifiable as a crime.&lt;br /&gt;&lt;br /&gt;Things are not as intuitive with music, movies and software. Sure, if someone invites me to their house to watch a DVD with them, it&#39;s not a crime. They&#39;re sharing their property with me of their own will. If they offer to give me a lift in their car, it&#39;s the same sort of thing. 
But you can&#39;t copy a car like you can a song, can you? You can&#39;t parody a car, either, or quote it.&lt;br /&gt;&lt;br /&gt;Intellectual property is a complicated and muddled concept and requires an equally complicated and muddled definition of fair use. With the mass adoption of digital content and media, things got even more complicated and muddled, which brings us to the chaos we&#39;re living in nowadays.&lt;br /&gt;&lt;br /&gt;And chaos it is. What else would you call it when in one corner you have people who demand that all digital content be free, as in free beer, and in the other corner you have giants like the RIAA &lt;a href=&quot;http://blog.wired.com/27bstroke6/riaa_trial/index.html#44321138&quot;&gt;suing random individuals&lt;/a&gt;?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;The Age of Middlemen&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;So far we&#39;ve looked at some of the factors causing piracy. But what about its effects? Organizations like the MPAA and the RIAA claim that they are losing money. Publishers like EA are &lt;a href=&quot;http://arstechnica.com/journals/thumbs.ars/2008/05/01/next-up-to-blame-piracy-for-pc-gamings-decline-peter-moore&quot;&gt;complaining&lt;/a&gt; about the same thing.&lt;br /&gt;&lt;br /&gt;Let&#39;s get one thing clear from the start: developers need game sales to survive. They develop games for a living and their money comes from sales, just like your money comes from the job you&#39;re doing. There is no doubt that if you install and play their game on your computer without buying it, you&#39;re using the products of their work without them receiving your money for it.&lt;br /&gt;&lt;br /&gt;Notice, however, that I haven&#39;t mentioned publishers in the previous paragraph. What does RIAA stand for? Recording Industry Association of America. The keyword here is &quot;recording&quot;. The people the RIAA represents are those who make money by selling sound recordings of music. 
That used to be very straightforward: they used to sell &lt;span style=&quot;font-weight: bold;&quot;&gt;records&lt;/span&gt;.&lt;br /&gt;&lt;br /&gt;Vinyl records are a bitch to duplicate. As long as they were the medium for sound reproduction, the recording industry could control the music industry by controlling the medium itself. The advent of magnetic tapes brought the first crisis. The big guys freaked out, tried to defend their territory and lost. That was the major turning point in the history of copyright.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Evolution&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Notice how the MPAA is making a lot less fuss than the RIAA? You know why? Their industry kept evolving. Sure, nowadays you can watch DVDs on your HDTV in your home theater, but nothing beats seeing the very first screening of a long-awaited movie in your theater of choice. Besides, everyone and their grandmother will have seen the movie before it comes out on DVD.&lt;br /&gt;&lt;br /&gt;The first important factor here is that the movie industry still has a meaningful experience to offer its customers, beyond merely distributing the medium on which the content is stored. The second important factor is that there&#39;s ample segmentation. You can choose how much you care about a movie: will you go to a theater or buy a DVD? Or you might rent it before deciding whether to buy it. Or you might just wait for it to come out on cable. Plenty of options out there.&lt;br /&gt;&lt;br /&gt;The RIAA, on the other hand, got stuck. They just sat there while their golden goose spread its wings and flew away. Well, they didn&#39;t &lt;span style=&quot;font-style: italic;&quot;&gt;just&lt;/span&gt; sit. They kicked and screamed and made a hell of a fuss. They&#39;re still doing it, in fact. But they failed to adapt and are paying the price for it.&lt;br /&gt;&lt;br /&gt;It&#39;s something I realized when I went to a Dream Theater concert in March this year. 
These guys didn&#39;t just play their instruments and sing their songs. No, they staged such an impressive multimedia spectacle that I didn&#39;t even mind the fact that they didn&#39;t play most of my favorite songs. After a show like that, which in itself must have made them a pretty penny, they could&#39;ve sold me their newest album without needing the middleman at all.&lt;br /&gt;&lt;br /&gt;Or could they?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Eye of the Beholder&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Piracy, like I said, has a lot to do with perception. You can call it culture or mindset or education, but the fact is that I don&#39;t know many people here in Chile who buy their music. If you look at it from their point of view, the reasoning is simple: why should they? It&#39;s out there on the Internet, for free. We all use MP3 players these days, so why should anyone pay for an overpriced lump of plastic that you later have to insert in a drive, rip, and transfer to your player before you can listen to it comfortably?&lt;br /&gt;&lt;br /&gt;Musicians, that&#39;s why. People who make that music need the money. Then again, the record industry screwed things up on that front too. The public perceives them as people who take huge cuts of the profits and exploit the authors. It doesn&#39;t make piracy right, but it makes it justifiable in people&#39;s eyes. We come again to perception.&lt;br /&gt;&lt;br /&gt;And perception is a much bigger problem than most of us would like to admit. Could Dream Theater really have sold us their music directly, without going through the recording industry? Why would we buy it, when we&#39;re so accustomed to getting our music for free?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Detox: Rehab or Jail?&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;A habit is formed. How do we break it? The MPAA seems to think the answer lies in propaganda. 
The effects remain to be seen, but I personally &lt;a href=&quot;http://blog.wired.com/27bstroke6/2008/01/a-fathers-confe.html&quot;&gt;doubt&lt;/a&gt; it will have any significant effect. The RIAA, on the other hand, thinks the answer lies in law enforcement. We&#39;ve all seen how that&#39;s working out so far and I think we can agree that the idea is laughable.&lt;br /&gt;&lt;br /&gt;What about software? Specifically, what about games? The software industry in general is not as monolithic as the recording or motion picture industries. For example, applications targeted at big corporations don&#39;t have to worry about piracy too much, but their market is not as big as the home user market. Each type of software has its own worries, but games are in a particularly hairy situation.&lt;br /&gt;&lt;br /&gt;The problem with the games industry is that it resembles the recording industry. As the saying goes, the developers make the game and the publishers make the money. They&#39;re not limited to being middlemen, though. Often they also finance the development, which gives them a bit of versatility when it comes to surviving paradigm shifts. However, the fact remains that online distribution channels will make the publishers&#39; role as middlemen increasingly obsolete as time passes. Of course, online distribution brings its own middlemen, as casual game developers are &lt;a href=&quot;http://www.gamasutra.com/view/feature/3611/the_casual_games_manifesto.php&quot;&gt;discovering&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Safe Hex: ACME Copy Protection&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;So how does the software industry fight piracy? We geeks tend to believe that everything can be solved by technology. Thus the copy protection mechanisms.&lt;br /&gt;&lt;br /&gt;Now, wiser geeks know that technology is not an answer to everything. 
As &lt;a href=&quot;http://en.wikipedia.org/wiki/Fravia&quot;&gt;Fravia&lt;/a&gt; used to teach, there is no copy protection that cannot be broken. The one time I thought I had found an exception to that rule was when I used &lt;a href=&quot;http://www.kali.net/&quot;&gt;Kali&lt;/a&gt;. I was wrong, of course. The program itself is just a client for a centralized service and, as such, it can be copied as freely as you want. It&#39;s semantics, I know, but it&#39;s important.&lt;br /&gt;&lt;br /&gt;The reason it&#39;s important is that this solution doesn&#39;t apply to most games. The most notable exceptions are MMOs. Incidentally, it&#39;s a pretty important factor in the popularity of this genre among developers.&lt;br /&gt;&lt;br /&gt;The rest of the genres have to choose whether to use copy protection mechanisms or not. If they do, they have to decide how strong to make them. Unfortunately, strong often means &quot;problematic for the legitimate customer&quot; along with &quot;difficult to circumvent&quot;. This is what creates outrage, such as the response to the abortive attempt to &quot;secure&quot; &lt;span style=&quot;font-style: italic;&quot;&gt;Mass Effect&lt;/span&gt; and &lt;span style=&quot;font-style: italic;&quot;&gt;Spore&lt;/span&gt;.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;No Silver Bullet&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Okay, so neither propaganda nor law enforcement nor technology solves our problem. What&#39;s the answer, then?&lt;br /&gt;&lt;br /&gt;Unfortunately, nothing worth doing is ever easy. First of all, there is no silver bullet. There&#39;s no magical solution to this mess. You can bet that the real solution won&#39;t bring you quick bucks and instant happiness.&lt;br /&gt;&lt;br /&gt;I&#39;m no expert on these matters and I can&#39;t say with certainty how to deal with this. 
However, I have a hunch and I&#39;m willing to bet it&#39;s a good one: nurture the market.&lt;br /&gt;&lt;br /&gt;Take the situation here in Chile. For one thing, games are outlandishly expensive. It&#39;s not just the price itself, it&#39;s how that price compares to people&#39;s earnings and other products. Second, there are too few stores selling games and their selection usually leaves a lot to be desired. There are too many obsolete games and too many crap games and too few hot items. Third, the same comments apply when it comes to renting games. Still, there are people who buy games. We do it because we like those games, we appreciate the effort it took to make them and we&#39;re proud to own them. But there&#39;s a &lt;span style=&quot;font-weight: bold;&quot;&gt;lot&lt;/span&gt; of room for improvement.&lt;br /&gt;&lt;br /&gt;What about countries like the United States? Adapt. MMOs are just one trend. &lt;span style=&quot;font-style: italic;&quot;&gt;The Sims&lt;/span&gt; games are innovative in that they introduce a social component and allow people to create whole communities around them. Episodic content is another idea that has yet to be fully explored. Who knows what interesting new idea will come along next?&lt;br /&gt;&lt;br /&gt;Above all, nurture the market. I don&#39;t know any comic book fan who opted to photocopy a comic book instead of buying it. They &lt;span style=&quot;font-weight: bold;&quot;&gt;love&lt;/span&gt; comic books. They&#39;re quite fanatical about them. Games need the same kind of fans. And they sure as hell won&#39;t get them by treating their customers like criminals and making games even harder to install.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;And Yet It Moves&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;So, what, piracy is here to stay and we have to learn to deal with it? Is that it? All this writing and I have no groundbreaking solution?&lt;br /&gt;&lt;br /&gt;Yes, that&#39;s more or less it. 
I did warn you at the beginning that the future is still uncertain. Yet I&#39;m sure it&#39;s not a dark one. Consider the fact that &lt;span style=&quot;font-style: italic;&quot;&gt;Sins of a Solar Empire&lt;/span&gt; has no copy protection and &lt;a href=&quot;http://arstechnica.com/news.ars/post/20080320-pc-game-developer-has-radical-message-ignore-the-pirates.html&quot;&gt;doesn&#39;t seem to need one&lt;/a&gt;. Consider the fact that casual games are successful enough to &lt;a href=&quot;http://blog.wired.com/games/2008/05/rockstar-vp-las.html&quot;&gt;make a Rockstar VP nervous&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;Are &quot;hardcore&quot; PC games a dying market, then? Not at all. There will always be a market for them and, if the industry does its homework, it won&#39;t be just a niche market. Though the lessons to learn may be hard, failing to learn them won&#39;t be fun for either the industry or the gamers.</description><link>http://beardseye.blogspot.com/2008/05/piracy-is-here-to-stay.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-4813328960411821826</guid><pubDate>Mon, 12 May 2008 22:36:00 +0000</pubDate><atom:updated>2008-07-26T22:52:38.647-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">assassin&#39;s creed</category><category domain="http://www.blogger.com/atom/ns#">game design</category><category domain="http://www.blogger.com/atom/ns#">games</category><category domain="http://www.blogger.com/atom/ns#">review</category><title>Assassin&#39;s Creed</title><description>How can you know whether a game is good without playing it? You can&#39;t, really. It wouldn&#39;t be fun if you could, right? Fortunately, there are people out there who play games and then tell you about their experience. 
Some of them do it professionally, some of them just because they think the world really needs to know about those games. &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; is one of those games that are good enough to make me want to join the latter group. However, unlike most such games, it also has defects that appeal to my inner ranter. Perfect blogging material.&lt;br /&gt;&lt;br /&gt;Just to make it clear, this is not a traditional review, rich in detail and sprinkled with screenshots. My goal is not to give you a detailed description of what you experience as you play &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt;. If that&#39;s what you&#39;re looking for, then I recommend the &lt;a href=&quot;http://www.gamespot.com/&quot;&gt;Gamespot&lt;/a&gt; reviews, for &lt;a href=&quot;http://www.gamespot.com/xbox360/action/assassinscreed/review.html?om_act=convert&amp;amp;om_clk=gssummary&amp;amp;tag=summary;review&quot;&gt;Xbox 360&lt;/a&gt;, &lt;a href=&quot;http://www.gamespot.com/ps3/action/assassinscreed/review.html?om_act=convert&amp;amp;om_clk=gssummary&amp;amp;tag=summary;review&quot;&gt;PS3&lt;/a&gt; and &lt;a href=&quot;http://www.gamespot.com/pc/action/assassinscreed/review.html?om_act=convert&amp;amp;om_clk=gssummary&amp;amp;tag=summary;review&quot;&gt;PC&lt;/a&gt;. What I&#39;m focusing on here are the strengths and weaknesses of the game. The idea is to help people who never played it before and people interested in game design.&lt;br /&gt;&lt;br /&gt;Still with me? Okay, let&#39;s delve into &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt;.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Stunning Visuals&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;You know how games always look a lot better in reviews and trailers than when you actually play them? 
I expected the same to happen with &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt;. To say that I was surprised is a huge understatement. This game is simply beautiful.&lt;br /&gt;&lt;br /&gt;My PC has a dated CPU -- the installer warned me that I didn&#39;t meet the requirements -- but my nVidia 7900 GTX more than compensates for it. Even so, I can&#39;t configure the graphics to maximum quality and detail. Or rather, I could if I didn&#39;t mind playing a slideshow instead of a game. As it is, the graphics are still stunning.&lt;br /&gt;&lt;br /&gt;Everything is richly textured and beautifully illuminated. Each city has its own personality. The devastating aftermath of war in Acre makes itself evident not only in its burned houses and damaged city walls; it also permeates the very atmosphere of the city, thanks to the greyish lighting that stands in contrast to the golden colors of Damascus and Jerusalem.&lt;br /&gt;&lt;br /&gt;The animations flow smoothly and every now and then you&#39;ll get a very dramatic angle that showcases your prowess at swordplay during a fight with guards. Of course, sometimes the game will get it wrong and you&#39;ll get a spectacular shot of a tree or a merchant stand between you and the camera.&lt;br /&gt;&lt;br /&gt;Another excellent detail is the blurring effect the game applies when you lock on a target. It&#39;s supposed to make you feel like an assassin focusing intently on his victim and it does a great job.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Parkour&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;When I first saw the word &quot;parkour&quot; my reaction went along the lines of &quot;huh?&quot; It turns out that most of us know of it as &quot;&lt;a href=&quot;http://www.youtube.com/watch?v=3KSr1pozm6Y&quot;&gt;free running&lt;/a&gt;&quot;. It also turns out that it&#39;s not the same thing at all. 
I won&#39;t pretend I&#39;m an expert and, for that same reason, I won&#39;t explain the difference. You have &lt;a href=&quot;http://en.wikipedia.org/wiki/Parkour&quot;&gt;Wikipedia&lt;/a&gt; for that and you can blame them for any errors in it. Suffice it to say that the gameplay of &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; incorporates parkour in a most enjoyable way.&lt;br /&gt;&lt;br /&gt;I remember that one of my friends once proposed making a game about free running. We shot down that idea for various reasons -- where the hell were we going to get money for an AAA game anyway? -- and one of them was the fact that we thought it wouldn&#39;t be interesting on its own, that it needed something more. I still believe that, especially after playing &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt;. There&#39;s something immensely satisfying about ending your crazy race over rooftops with a spectacular pounce and stab that brings your hapless victim down with a startled, strangled cry.&lt;br /&gt;&lt;br /&gt;By the way, if you&#39;re into game development, I highly recommend reading this &lt;a href=&quot;http://www.gamasutra.com/view/feature/3571/game_culture_vultures_parkour.php&quot;&gt;Gamasutra article&lt;/a&gt; on parkour in &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; and &lt;span style=&quot;font-style: italic;&quot;&gt;Crackdown&lt;/span&gt;.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Music and Sound Effects&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;When I heard that Jesper Kyd composed the soundtrack for the game I had high hopes and I wasn&#39;t disappointed. 
I personally think that he didn&#39;t outdo his work in &lt;span style=&quot;font-style: italic;&quot;&gt;Silent Assassin&lt;/span&gt;, but the score is still great: subtle enough not to intrude on your experience, yet dramatic enough to establish the appropriate mood.&lt;br /&gt;&lt;br /&gt;I also liked the way NPCs contribute to the game&#39;s atmosphere: preachers, beggars, merchants and passers-by all have something to say and they do it in a way that successfully recreates the bustle of a living city. Of course, they have their limitations. If you start paying a lot of attention to them, you&#39;ll find them repetitive; but if you focus on your mission, like an assassin is supposed to, they&#39;ll provide a rich auditory tapestry to serve as a background.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Good Plot&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;I firmly believe that games are essentially stories created, in part, by players. Whenever you find yourself gushing with enthusiasm while you describe a match of Unreal Tournament to your friends, you&#39;re telling them a story. That said, games don&#39;t necessarily need a plot. But if you&#39;re going to play an assassin, it&#39;s good to have something to motivate you to kill your victims, right?&lt;br /&gt;&lt;br /&gt;I&#39;m usually quite critical of plots. It comes from being a fan of books in an age when most people read only newspapers or technical books, if that. I am glad to say that &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; has a rather good plot. It&#39;s interesting and it&#39;s not shallow. Not one among the characters is what he or she seems to be at first glance. There are layers to be peeled and they reveal that the world around you is not black and white, even though it would be a lot easier on your conscience that way.&lt;br /&gt;&lt;br /&gt;Altaïr himself, the character you control, is a decidedly unpleasant person at the beginning, not likeable at all. 
You might think it&#39;s a dangerous gamble to do that to the character with whom you&#39;re supposed to identify, but it works out surprisingly well.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Major Cliffhanger&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Few are the works that can get away with a cliffhanger. &lt;a href=&quot;http://www.amazon.com/gp/product/0553572946&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;Endymion&lt;/span&gt;&lt;/a&gt; pulled it off and lived to tell (the rest of) the tale. &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; tried to pull it off and excelled at pissing me off. And everyone else, judging by the reactions.&lt;br /&gt;&lt;br /&gt;Yes, it&#39;s true that almost all of the mysteries that plagued you during the game are resolved at the end. But that&#39;s not enough for a satisfying ending. I don&#39;t know about the people at Ubisoft, but I like &lt;span style=&quot;font-weight: bold;&quot;&gt;closure&lt;/span&gt; at the end of the game. It doesn&#39;t hurt the sequel at all, just look at the &lt;span style=&quot;font-style: italic;&quot;&gt;Sands of Time&lt;/span&gt;. Here&#39;s an idea, Ubisoft: go read Jim Butcher&#39;s article about &lt;a href=&quot;http://jimbutcher.livejournal.com/2007/11/19/&quot;&gt;story climaxes&lt;/a&gt;. Just in case it doesn&#39;t help immediately, the ingredient you&#39;re missing is what he calls &quot;resolution&quot;.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Cheesy Dialogue&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Like I said, I&#39;m a fan of books. That makes me spoiled when it comes to things like plot, dialogue and character development. I&#39;m aware of that and I try to keep it under control when I&#39;m criticizing a game. 
After all, games are a different medium and one of the worst kinds of game designer is the failed writer.&lt;br /&gt;&lt;br /&gt;That said, I have to draw the line somewhere. In the case of &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt;, I drew it here:&lt;br /&gt;&lt;blockquote&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Rafiq:&lt;/span&gt; He must be stopped!&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Altaïr:&lt;/span&gt; That&#39;s why I&#39;m here.&lt;/blockquote&gt;I don&#39;t mind a certain amount of cliché in games. I don&#39;t require voice actors to be perfect. But some of Altaïr&#39;s lines are way too cheesy to tolerate. It doesn&#39;t help that Philip Shahbaz, the voice of Altaïr, seems to have exactly one tone of voice. I guess it&#39;s supposed to sound arrogant and menacing, which suits Altaïr&#39;s personality admirably, but even so, you cannot apply it to everything you say. Sooner or later, your players start perceiving you as sulky instead of menacing.&lt;br /&gt;&lt;br /&gt;To be fair, I should state that not all of the dialogue is cheesy. Al Mualim has some excellent lines and his voice actor, Peter Renaday, did a very good job.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;In-yer-face Interface&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Just as I started with the most gushing praise of the best features, so I kept the worst for last. Many reviews complain about the repetitive nature of the gameplay, but I don&#39;t find that to be such a big problem. Solving it would have been quite costly and the developers did their best to mitigate it by offering the player a variety of investigation missions and giving him the liberty to choose which to play and which to skip.&lt;br /&gt;&lt;br /&gt;The worst problem, however, comes from some monumentally bad decisions that were perfectly avoidable. 
All of them have to do with the interface, but the crown jewel among them is the decision to give the player the freedom to move during the cut scenes.&lt;br /&gt;&lt;br /&gt;Picture the following situation: After doing lots of investigative work that helped you decide where, when and how you&#39;ll assassinate your victim, you finally have him in sight. He&#39;s talking to someone else and nobody is paying any attention to you at all. You&#39;re free to move, so you start creeping towards his back. You&#39;re in position, the moment is perfect and you press the button. And nothing happens. You press the button repeatedly, muttering &quot;stab him, dammit!&quot; It&#39;s useless. You&#39;re in a cut scene and nothing you do will have any effect. The developers gave you a completely useless freedom and the only thing they achieved was to confuse you.&lt;br /&gt;&lt;br /&gt;And they took away a number of very useful freedoms from you, such as the freedom to skip a cut scene or a tutorial. It&#39;s an old &lt;a href=&quot;http://blogs.msdn.com/shawnhar/archive/2007/05/24/transitions-concluded-there-is-no-spoon.aspx&quot;&gt;lesson&lt;/a&gt;, but people at Ubisoft haven&#39;t learned it. They didn&#39;t let you skip the cut scenes in &lt;span style=&quot;font-style: italic;&quot;&gt;Prince of Persia&lt;/span&gt; and they don&#39;t let you do it in &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; either. Come on, people, it&#39;s a &lt;a href=&quot;http://www.designersnotebook.com/Columns/062_Bad_Game_Designer_V/movies062_bad_game_designer_v.htm&quot;&gt;Twinkie Denial Condition&lt;/a&gt;!&lt;br /&gt;&lt;br /&gt;They also took away your freedom to load and save as you wish, which is another Twinkie Denial Condition. 
While I appreciate that the game saves automatically whenever I finish a task or get to a checkpoint, I still hate that I have to let the guards kill me if my assassination attempt didn&#39;t go the way I wanted, just so the game will reload the last save.&lt;br /&gt;&lt;br /&gt;Then there&#39;s also the matter of quitting the game or switching the profile. To exit the game, I first have to exit &lt;span style=&quot;font-style: italic;&quot;&gt;to&lt;/span&gt; the Animus, then exit &lt;span style=&quot;font-style: italic;&quot;&gt;from&lt;/span&gt; the Animus, then &quot;quit&quot; the game, then select a profile (which one? doesn&#39;t matter, I want to quit) and then finally exit the game. Oh, and I have to confirm that I wish to exit, just in case I went through all that work by accident. Yes, I know this is probably only a PC issue and that the PC version suffers a lot more from not being properly adapted to keyboard and mouse, but it&#39;s still incredibly annoying. The keyboard and mouse problem can at least be solved by plugging in a gamepad.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Conclusion&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Don&#39;t get me wrong. I loved &lt;span style=&quot;font-style: italic;&quot;&gt;Assassin&#39;s Creed&lt;/span&gt; and I can&#39;t wait for the sequel, even though Philip Shahbaz will &lt;a href=&quot;http://www.imdb.com/title/tt1201133/&quot;&gt;again&lt;/a&gt; be Altaïr&#39;s voice. I mean, let&#39;s give the guy a second chance, right? Seriously, though, I loved the game. 
Despite the obvious and perfectly avoidable problems, I recommend it warmly.&lt;br /&gt;&lt;br /&gt;And if Ubisoft learns from their mistakes, the sequel should be even better.</description><link>http://beardseye.blogspot.com/2008/05/assassins-creed.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5841664787360446638</guid><pubDate>Mon, 12 May 2008 18:05:00 +0000</pubDate><atom:updated>2008-05-12T14:08:37.846-04:00</atom:updated><title>Back in Blog</title><description>Well, I think I finally got the hang of this being-a-dad thing; at least enough to get back to blogging. Suuuure, blame it all on the baby, that&#39;s convenient. Heh.&lt;br /&gt;&lt;br /&gt;Anyway, things have been hectic in my life, but I&#39;m back and I have stuff to rant about again. That said, I&#39;m probably going to blog less code and include more variety. Oh and the posts are probably going to be shorter. Like this one. 
Not everything has to be a great story or an essay, right?</description><link>http://beardseye.blogspot.com/2008/05/back-in-blog.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-653886424202405375</guid><pubDate>Tue, 06 Nov 2007 22:40:00 +0000</pubDate><atom:updated>2011-11-07T22:49:58.351-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">.NET</category><category domain="http://www.blogger.com/atom/ns#">.NUTS</category><category domain="http://www.blogger.com/atom/ns#">boxing</category><category domain="http://www.blogger.com/atom/ns#">generators</category><category domain="http://www.blogger.com/atom/ns#">iterators</category><category domain="http://www.blogger.com/atom/ns#">performance</category><category domain="http://www.blogger.com/atom/ns#">profiling</category><category domain="http://www.blogger.com/atom/ns#">yield</category><title>.NUTS: Yield Not, Waste Not</title><description>Everyone agrees that iterators are cool. They&#39;re so cool, they&#39;ve got their own design pattern. Even cooler than that, they made the folks at Sun introduce some (gasp!) new syntax into Java. Not to be outdone, C# goes even further: not only can you use iterators easily, you can also roll your own generators using the &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt; keyword. How cool is that?&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;To Litter or Not to Litter&lt;br /&gt;&lt;/span&gt;As it turns out, it&#39;s not as cool as it sounds. If you care about memory allocation and garbage, then using &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt; is a bad idea. Before I go any further, let&#39;s explore this concern itself. Why would anyone care about memory allocation in a language with a garbage collector? Isn&#39;t the whole purpose of the garbage collector to abstract away memory allocation and help avoid all those nasty pointer-related bugs? If you find yourself nodding along to these questions, then I suggest that you read Joel&#39;s &lt;a href=&quot;http://www.joelonsoftware.com/articles/LeakyAbstractions.html&quot;&gt;Law of Leaky Abstractions&lt;/a&gt;.&lt;br /&gt;
&lt;br /&gt;
The reasons to care about garbage are diverse. In my case, it&#39;s because I&#39;m working with XNA. That means that if I make too much garbage at the wrong time, the garbage collector will fire up and mess up my frame rate. As Shawn Hargreaves explained, there are &lt;a href=&quot;http://blogs.msdn.com/shawnhar/archive/2007/07/02/twin-paths-to-garbage-collector-nirvana.aspx&quot;&gt;two ways&lt;/a&gt; to prevent this. Myself, I prefer what he calls the Right Frequency Path, which is why using &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt; is a bad idea for me.&lt;br /&gt;
&lt;br /&gt;
There are other kinds of software, where garbage might not be as critical an issue as in games, but you should still be aware of it. Too much garbage can cripple the performance of any system.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Garbage Generators&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
So far I have only &lt;span style=&quot;font-style: italic;&quot;&gt;claimed&lt;/span&gt; that &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt; is wasteful. Let&#39;s take a look at why it&#39;s so. Consider this short program:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;using System;
using System.Collections.Generic;
using System.Text;

namespace Sandbox
{
 static class IteratorTest
 {
     public static IEnumerable&amp;lt;int&amp;gt; FibonacciYield(int max)
     {
         if (max &amp;gt; 1)
         {
             int a = 1;
             int b = 1;

             yield return a;
             yield return b;

             int c = a + b;

             while (c &amp;lt; max)
             {
                 yield return c;
                 a = b;
                 b = c;
                 c = a + b;
             }
         }
     }
 }

 class Program
 {
     static void Main(string[] args)
     {
         int sum = 0;
         for (int l = 0; l &amp;lt; 100000; l++)
         {
             foreach (int i in IteratorTest.FibonacciYield(17))
             {
                 sum += i;
             }
         }
         Console.WriteLine(sum);
     }
 }
}&lt;/pre&gt;
&lt;br /&gt;
Running CLR Profiler on this program might surprise you. Although there is no explicit allocation in the code, plenty of memory winds up in the landfill. The culprit is a class with a weird name: &lt;span class=&quot;code&quot;&gt;&amp;lt;FibonacciYield&amp;gt;d__0&lt;/span&gt;. In this simple example, it is responsible for a whopping 3.4 MB garbage pile. Where did that come from?&lt;br /&gt;
&lt;br /&gt;
That&#39;s the price one has to pay for the magic. After all, when you write your own generator with &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt;, you&#39;re returning &lt;span class=&quot;code&quot;&gt;IEnumerable&amp;lt;T&amp;gt;&lt;/span&gt;, without even instantiating it, let alone implementing it. But the return value has to come from somewhere and that somewhere is a compiler-generated class. If you run Reflector on the program and then take a look at the method &lt;span class=&quot;code&quot;&gt;FibonacciYield&lt;/span&gt;, you&#39;ll get this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public static IEnumerable&amp;lt;int&amp;gt; FibonacciYield(int max)
{
  &amp;lt;FibonacciYield&amp;gt;d__0 d__ = new &amp;lt;FibonacciYield&amp;gt;d__0(-2);
  d__.&amp;lt;&amp;gt;3__max = max;
  return d__;
}&lt;/pre&gt;
&lt;br /&gt;
Just as suspected, that&#39;s where and how allocation happens. But what about our mysterious compiler-generated class with an ugly name?&lt;br /&gt;
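If you'd rather not fire up a profiler, you can confirm the allocations in code. A sketch, with one caveat: it uses GC.GetAllocatedBytesForCurrentThread, which didn't exist when this post was written (it arrived in .NET Core 3.0 / .NET Framework 4.8) and it assumes a compiler with top-level statements, so treat it as a modern substitute for the CLR Profiler run described above:

```csharp
using System;
using System.Collections.Generic;

// The generator from the sample program, as a local function.
static IEnumerable<int> FibonacciYield(int max)
{
    if (max > 1)
    {
        int a = 1, b = 1;
        yield return a;
        yield return b;
        for (int c = a + b; c < max; c = a + b)
        {
            yield return c;
            a = b;
            b = c;
        }
    }
}

// Count the bytes the loop allocates on this thread.
long before = GC.GetAllocatedBytesForCurrentThread();
int sum = 0;
for (int l = 0; l < 100000; l++)
{
    foreach (int i in FibonacciYield(17))
    {
        sum += i;  // each pass through the outer loop allocates one state-machine object
    }
}
long allocated = GC.GetAllocatedBytesForCurrentThread() - before;
Console.WriteLine(sum);
Console.WriteLine(allocated);  // the state machines add up to megabytes of garbage
```

The exact byte count varies by compiler and runtime, but the shape of the result matches the profiler's verdict: one heap object per call to the generator method.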
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Under The Hood&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
The disassembly of the &lt;span class=&quot;code&quot;&gt;&amp;lt;FibonacciYield&amp;gt;d__0&lt;/span&gt; class shows not only an implementation of the &lt;span class=&quot;code&quot;&gt;IEnumerable&amp;lt;int&amp;gt;&lt;/span&gt; interface, but also a full-fledged &lt;span class=&quot;code&quot;&gt;IEnumerator&amp;lt;int&amp;gt;&lt;/span&gt; implementation. It has all the stuff a generator should have: the &lt;span class=&quot;code&quot;&gt;Current&lt;/span&gt; property, the &lt;span class=&quot;code&quot;&gt;Reset&lt;/span&gt; method and the &lt;span class=&quot;code&quot;&gt;MoveNext&lt;/span&gt; method.&lt;br /&gt;
&lt;br /&gt;
At first, the code might look intimidating -- all &lt;span style=&quot;font-style: italic;&quot;&gt;this&lt;/span&gt; came out of &lt;span style=&quot;font-style: italic;&quot;&gt;that&lt;/span&gt; short method?! -- but a closer look reveals certain patterns. For example, take a look at the fields:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;private int &amp;lt;&amp;gt;1__state;
private int &amp;lt;&amp;gt;2__current;
public int &amp;lt;&amp;gt;3__max;
public int &amp;lt;a&amp;gt;5__1;
public int &amp;lt;b&amp;gt;5__2;
public int &amp;lt;c&amp;gt;5__3;
public int max;&lt;/pre&gt;
&lt;br /&gt;
The &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;1__state&lt;/span&gt; and &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;2__current&lt;/span&gt; fields have to do with how the &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt; keyword itself works and I&#39;ll get back to them in a few moments. The rest show an interesting pattern: the local variables in the original method turn into the &lt;span class=&quot;code&quot;&gt;&amp;lt;a&amp;gt;5__1&lt;/span&gt;, &lt;span class=&quot;code&quot;&gt;&amp;lt;b&amp;gt;5__2&lt;/span&gt; and &lt;span class=&quot;code&quot;&gt;&amp;lt;c&amp;gt;5__3&lt;/span&gt; fields, whereas the &lt;span class=&quot;code&quot;&gt;max&lt;/span&gt; argument becomes the &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;3__max&lt;/span&gt; field. But what about the &lt;span class=&quot;code&quot;&gt;max&lt;/span&gt; field? Why have both &lt;span class=&quot;code&quot;&gt;max&lt;/span&gt; and &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;3__max&lt;/span&gt;? And why convert local variables into fields in the first place?&lt;br /&gt;
&lt;br /&gt;
To answer that, remember how the &lt;span class=&quot;code&quot;&gt;foreach&lt;/span&gt; loop works: by calling the &lt;span class=&quot;code&quot;&gt;MoveNext&lt;/span&gt; method of the &lt;span class=&quot;code&quot;&gt;IEnumerator&amp;lt;T&amp;gt;&lt;/span&gt; interface and inspecting its &lt;span class=&quot;code&quot;&gt;Current&lt;/span&gt; property, until &lt;span class=&quot;code&quot;&gt;MoveNext&lt;/span&gt; returns false. This means that the state of whatever you&#39;re doing in your generator has to be preserved between two calls to &lt;span class=&quot;code&quot;&gt;MoveNext&lt;/span&gt;. In our case, the local variables of the original method are a part of the state that has to be preserved.&lt;br /&gt;
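To make that concrete, the foreach in the sample program is essentially sugar for a hand-written loop over the enumerator. A sketch (the compiler actually emits a try/finally to dispose the enumerator, which the using statement stands in for here, and the snippet assumes a compiler with top-level statements):

```csharp
using System;
using System.Collections.Generic;

// The generator from the sample program, as a local function.
static IEnumerable<int> FibonacciYield(int max)
{
    if (max > 1)
    {
        int a = 1, b = 1;
        yield return a;
        yield return b;
        for (int c = a + b; c < max; c = a + b)
        {
            yield return c;
            a = b;
            b = c;
        }
    }
}

// Roughly what `foreach (int i in FibonacciYield(17))` boils down to:
int sum = 0;
using (IEnumerator<int> e = FibonacciYield(17).GetEnumerator())
{
    while (e.MoveNext())
    {
        sum += e.Current;
    }
}
Console.WriteLine(sum);  // 33 (1 + 1 + 2 + 3 + 5 + 8 + 13)
```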
&lt;br /&gt;
Speaking of state, the magic is explained when you look at the implementation of the &lt;span class=&quot;code&quot;&gt;MoveNext&lt;/span&gt; method:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;private bool MoveNext()
{
   switch (this.&amp;lt;&amp;gt;1__state)
   {
       case 0:
           this.&amp;lt;&amp;gt;1__state = -1;
           if (this.max &amp;lt;= 1)
           {
               break;
           }
           this.&amp;lt;a&amp;gt;5__1 = 1;
           this.&amp;lt;b&amp;gt;5__2 = 1;
           this.&amp;lt;&amp;gt;2__current = this.&amp;lt;a&amp;gt;5__1;
           this.&amp;lt;&amp;gt;1__state = 1;
           return true;

       case 1:
           this.&amp;lt;&amp;gt;1__state = -1;
           this.&amp;lt;&amp;gt;2__current = this.&amp;lt;b&amp;gt;5__2;
           this.&amp;lt;&amp;gt;1__state = 2;
           return true;

       case 2:
           this.&amp;lt;&amp;gt;1__state = -1;
           this.&amp;lt;c&amp;gt;5__3 = this.&amp;lt;a&amp;gt;5__1 + this.&amp;lt;b&amp;gt;5__2;
           while (this.&amp;lt;c&amp;gt;5__3 &amp;lt; this.max)
           {
               this.&amp;lt;&amp;gt;2__current = this.&amp;lt;c&amp;gt;5__3;
               this.&amp;lt;&amp;gt;1__state = 3;
               return true;
           Label_00C5:
               this.&amp;lt;&amp;gt;1__state = -1;
               this.&amp;lt;a&amp;gt;5__1 = this.&amp;lt;b&amp;gt;5__2;
               this.&amp;lt;b&amp;gt;5__2 = this.&amp;lt;c&amp;gt;5__3;
               this.&amp;lt;c&amp;gt;5__3 = this.&amp;lt;a&amp;gt;5__1 + this.&amp;lt;b&amp;gt;5__2;
           }
           break;

       case 3:
           goto Label_00C5;
   }
   return false;
}&lt;/pre&gt;
&lt;br /&gt;
This is the heart of the compiler-generated finite state machine designed to react correctly to subsequent calls to &lt;span class=&quot;code&quot;&gt;MoveNext&lt;/span&gt;. The initial state is 0. Every time the compiler encounters a &lt;span class=&quot;code&quot;&gt;yield return&lt;/span&gt; statement, it creates a new state and generates the code that:&lt;br /&gt;
&lt;ol&gt;
&lt;li&gt;sets &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;1__state&lt;/span&gt; to the new state&lt;/li&gt;
&lt;li&gt;sets &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;2__current&lt;/span&gt; to the value in the &lt;span class=&quot;code&quot;&gt;yield return&lt;/span&gt; statement&lt;/li&gt;
&lt;li&gt;returns true.&lt;/li&gt;
&lt;/ol&gt;
By the way, don&#39;t try to plug this code back into the sample program. The C# compiler will reject it, for reasons about which I will rant in another post.&lt;br /&gt;
&lt;br /&gt;
The final mystery is resolved by looking at the &lt;span class=&quot;code&quot;&gt;GetEnumerator&lt;/span&gt; method implementation:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;IEnumerator&amp;lt;int&amp;gt; IEnumerable&amp;lt;int&amp;gt;.GetEnumerator()
{
   IteratorTest.&amp;lt;FibonacciYield&amp;gt;d__0 d__;
   if (Interlocked.CompareExchange(ref this.&amp;lt;&amp;gt;1__state, 0, -2) == -2)
   {
       d__ = this;
   }
   else
   {
       d__ = new IteratorTest.&amp;lt;FibonacciYield&amp;gt;d__0(0);
   }
   d__.max = this.&amp;lt;&amp;gt;3__max;
   return d__;
}&lt;/pre&gt;
&lt;br /&gt;
After doing some tedious deciphering, this turns out to mean the following: if &lt;span class=&quot;code&quot;&gt;GetEnumerator&lt;/span&gt; has already been called on this particular object, then create a fresh clone; if not, return this object. That&#39;s why we have both &lt;span class=&quot;code&quot;&gt;&amp;lt;&amp;gt;3__max&lt;/span&gt; and &lt;span class=&quot;code&quot;&gt;max&lt;/span&gt;: since arguments can be used as local variables within a method, the former is the original value supplied to &lt;span class=&quot;code&quot;&gt;FibonacciYield&lt;/span&gt; and the latter serves as a &quot;local variable&quot; field.&lt;br /&gt;
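You can observe this double life directly. A small sketch (the True/False pattern holds when GetEnumerator is called on the same thread that called the generator method, which is what the Interlocked check is guarding; the snippet assumes a compiler with top-level statements):

```csharp
using System;
using System.Collections.Generic;

// A trivial generator, just to get a compiler-generated state machine.
static IEnumerable<int> Numbers()
{
    yield return 1;
    yield return 2;
}

IEnumerable<int> seq = Numbers();
bool first = object.ReferenceEquals(seq, seq.GetEnumerator());
bool second = object.ReferenceEquals(seq, seq.GetEnumerator());
Console.WriteLine(first);   // True: the object doubles as its own first enumerator
Console.WriteLine(second);  // False: the second call allocates a fresh clone
```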
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Making It Tidy&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Now that we know how &lt;span class=&quot;code&quot;&gt;yield&lt;/span&gt; works, we can concentrate on avoiding memory allocation. Fortunately, C# allows you to write generators without implementing the &lt;span class=&quot;code&quot;&gt;IEnumerator&amp;lt;T&amp;gt;&lt;/span&gt; interface, as documented in the help file:&lt;br /&gt;
&lt;blockquote&gt;
In C#, it is not strictly necessary for a collection class to inherit from IEnumerable and IEnumerator in order to be compatible with &lt;span style=&quot;font-weight: bold;&quot;&gt;foreach&lt;/span&gt;; as long as the class has the required GetEnumerator, MoveNext, Reset, and Current members, it will work with &lt;span style=&quot;font-weight: bold;&quot;&gt;foreach&lt;/span&gt;. Omitting the interfaces has the advantage of allowing you to define the return type of Current to be more specific than &lt;span style=&quot;font-weight: bold;&quot;&gt;object&lt;/span&gt;, thereby providing type-safety.&lt;/blockquote&gt;
&lt;br /&gt;
Omitting the interfaces, in this case, has the added advantage that you can implement your generator as a struct without it ever getting boxed. Your final generator would look like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public struct ScroogeGenerator
{
    private int state;
    private int current;
    private int max;
    private int a;
    private int b;
    private int c;

    public ScroogeGenerator(int max)
    {
        this.state = 0;
        this.max = max;
        current = a = b = c = 0;
    }

    public int Current
    {
        get { return current; }
    }

    public bool MoveNext()
    {
        switch (state)
        {
            case 0:
                if (max &amp;lt;= 1) break;
                a = 1;
                b = 1;
                state = 1;
                current = a;
                return true;
            case 1:
                state = 2;
                current = b;
                return true;
            case 2:
                c = a + b;
                if (c &amp;gt;= max) break;
                a = b;
                b = c;
                current = c;
                return true;
        }
        return false;
    }

    public void Reset()
    {
        // foreach never calls Reset, so signal that it is unsupported.
        throw new NotSupportedException();
    }

    public ScroogeGenerator GetEnumerator() { return this; }
}&lt;/pre&gt;
&lt;br /&gt;
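Driving it is a one-liner -- assuming the &lt;span class=&quot;code&quot;&gt;ScroogeGenerator&lt;/span&gt; above, this &lt;span class=&quot;code&quot;&gt;foreach&lt;/span&gt; allocates nothing on the heap:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;static void Main()
{
    // Prints the Fibonacci numbers below 100: 1 1 2 3 5 8 13 21 34 55 89
    foreach (int n in new ScroogeGenerator(100))
    {
        Console.Write(n + &quot; &quot;);
    }
}&lt;/pre&gt;
&lt;br /&gt;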
A few things to keep in mind:&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;When you write your own value-type generator, make sure you never cast it to an interface, or it will get boxed and generate garbage. The best way to ensure that is not to implement any of the &lt;span class=&quot;code&quot;&gt;IEnumerable&lt;/span&gt; or &lt;span class=&quot;code&quot;&gt;IEnumerator&lt;/span&gt; interfaces.&lt;/li&gt;
&lt;li&gt;The C# &lt;span class=&quot;code&quot;&gt;foreach&lt;/span&gt; loop does not call the &lt;span class=&quot;code&quot;&gt;Reset&lt;/span&gt; method. You can leave it unimplemented or empty if you&#39;ll use your generator in C# &lt;span class=&quot;code&quot;&gt;foreach&lt;/span&gt; loops only.&lt;/li&gt;
&lt;li&gt;Don&#39;t reuse an instance of a value-type generator across different &lt;span class=&quot;code&quot;&gt;foreach&lt;/span&gt; loops. If you intend to do that, implement &lt;span class=&quot;code&quot;&gt;Reset&lt;/span&gt; and call it manually before reusing the generator.&lt;/li&gt;
&lt;/ul&gt;</description><link>http://beardseye.blogspot.com/2007/11/nuts-yield-not-waste-not.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>1</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-5448520732582509696</guid><pubDate>Wed, 19 Sep 2007 23:34:00 +0000</pubDate><atom:updated>2011-11-07T22:50:46.030-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">.NET</category><category domain="http://www.blogger.com/atom/ns#">.NUTS</category><category domain="http://www.blogger.com/atom/ns#">collections</category><category domain="http://www.blogger.com/atom/ns#">inheritance</category><category domain="http://www.blogger.com/atom/ns#">interfaces</category><category domain="http://www.blogger.com/atom/ns#">polymorphism</category><title>.NUTS: Cemented Collections</title><description>Another month has passed. My intention was to write another post sooner -- especially because I already had material for it -- but time flies when one is immersed in work. Now that I have some free time, I can sit down and share another one of my .NET anecdotes. And so, without further ado, welcome back to my .NUTS column.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Deja Vu&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
If there&#39;s one thing that defines a modern language as much as its syntax, it&#39;s the way it handles collections. Therefore, it shouldn&#39;t be such a big surprise that most of my encounters with .NET quirks stem from the differences between its collection classes and those I&#39;ve used in Java. Still, I hope this won&#39;t turn out to be a tradition; otherwise I will have to rename this column to &quot;Collection Corner&quot;.&lt;br /&gt;
&lt;br /&gt;
This particular tale, just like the last one, starts with the &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; generic class. Specifically, it starts with my need to be informed when someone manipulates its contents.  The classes I was writing had a rather simple purpose: associate event handlers with different input controller events. One class managed keyboard events, another mouse events and yet another gamepad events. Each one of these classes exposed at least one read-only property of appropriate &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; type. All went smoothly until I decided that I needed to know when someone added, removed or changed the handler for an event.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Keep Me Posted&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
The first logical thing to do was to check the documentation and see whether the &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; class itself offered the desired functionality. Despite my years of experience with the Java collections framework, I hoped Microsoft would surprise me pleasantly. Good thing that I didn&#39;t bet on it, though. After all these years of evolution, the two most sophisticated mainstream languages still don&#39;t offer the fundamental courtesy provided by Delphi&#39;s good old &lt;span class=&quot;code&quot;&gt;TStringList&lt;/span&gt; class.&lt;br /&gt;
&lt;br /&gt;
Just as I expected, I would have to implement the desired behavior myself. The logical choice seemed to be to override whatever manipulates the collection&#39;s contents: the &lt;span class=&quot;code&quot;&gt;Add&lt;/span&gt;, &lt;span class=&quot;code&quot;&gt;Remove&lt;/span&gt; and &lt;span class=&quot;code&quot;&gt;Clear&lt;/span&gt; methods, and the indexer. If you have any .NET experience, you&#39;re probably chuckling at my naïveté by now; if you don&#39;t, then I&#39;ll keep you in suspense a bit longer.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Virtually Polymorphic&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Perhaps the greatest difference between Java and .NET, certainly the one I find most difficult to get used to, is the way they handle inheritance polymorphism. In Java, all non-private methods are virtual by default. If you don&#39;t want derived classes to override a method, you have to explicitly declare it as final; once a method is declared final, derived classes cannot override it.&lt;br /&gt;
&lt;br /&gt;
In .NET, all methods are non-virtual by default. If you want a method to be overridable, you have to explicitly declare it as virtual. If you&#39;re overriding a method, you have to include the &lt;span class=&quot;code&quot;&gt;override&lt;/span&gt; keyword. If you want to stop an inherited virtual method from being overridden any further, you have to mark your override as &lt;span class=&quot;code&quot;&gt;sealed&lt;/span&gt;. And if you want to introduce a new method with the same name in a derived class, you have to use the &lt;span class=&quot;code&quot;&gt;new&lt;/span&gt; keyword.&lt;br /&gt;
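&lt;br /&gt;
Here is a minimal sketch of all four keywords side by side:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;class Base
{
    public virtual void Speak() { Console.WriteLine(&quot;Base&quot;); }
}

class Derived : Base
{
    // Overrides Base.Speak and forbids overriding it any further.
    public sealed override void Speak() { Console.WriteLine(&quot;Derived&quot;); }
}

class Hider : Derived
{
    // A brand new method that merely hides Derived.Speak.
    public new void Speak() { Console.WriteLine(&quot;Hider&quot;); }
}&lt;/pre&gt;
&lt;br /&gt;
Calling &lt;span class=&quot;code&quot;&gt;Speak&lt;/span&gt; through a &lt;span class=&quot;code&quot;&gt;Base&lt;/span&gt; reference on a &lt;span class=&quot;code&quot;&gt;Hider&lt;/span&gt; instance still prints &quot;Derived&quot;: the &lt;span class=&quot;code&quot;&gt;new&lt;/span&gt; method hides, but does not override.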
&lt;br /&gt;
At first, this seems like a lot of fuss to achieve virtually the same results, pardon the pun. However, these differences are extremely important, as they represent different design philosophies. The .NET way makes you think twice before you decide which parts of your code will be flexible and which set in stone. As Eric Lippert will &lt;a href=&quot;http://blogs.msdn.com/ericlippert/archive/2004/01/07/virtual-methods-and-brittle-base-classes.aspx&quot;&gt;tell&lt;/a&gt; you, this mitigates the notorious &lt;a href=&quot;http://en.wikipedia.org/wiki/Fragile_base_class&quot;&gt;Fragile Base Class problem&lt;/a&gt;, which &lt;span style=&quot;font-style: italic;&quot;&gt;should&lt;/span&gt; be cool, right?&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;No Jack For Plug&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
The aforementioned philosophical differences are not all about how many polymorphic angels can dance on the head of a non-private pin. There are some very practical consequences of making all your methods non-virtual by default. One of these consequences is rather unfortunate: none of the methods in the &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; class are virtual. In other words, there&#39;s no way to modify the behavior of the &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; class so that you get a notification if its contents change. Or is there?&lt;br /&gt;
&lt;br /&gt;
Everyone is really excited about the shiny new features in C# 3, such as &lt;a href=&quot;http://community.bartdesmet.net/blogs/bart/archive/2007/07/28/c-3-0-partial-methods-what-why-and-how.aspx&quot;&gt;partial methods&lt;/a&gt;. It will be interesting to see whether the collection classes will benefit from this change. I haven&#39;t downloaded Orcas yet, so I don&#39;t know. I won&#39;t hold my breath, though. Besides, a new version of the language and the surrounding framework won&#39;t help those who need to extend the collection classes &lt;span style=&quot;font-weight: bold;&quot;&gt;now&lt;/span&gt;. What solution could they use?&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Interface/Off&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Fortunately, there is a way to hack around both the language limitations and the rigid design of the collection classes, without too much work. The key is to fall back on interface polymorphism. Where before I exposed my dictionary as a read-only property of the generic &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; type, now I have to change its type to the generic &lt;span class=&quot;code&quot;&gt;IDictionary&lt;/span&gt; interface.&lt;br /&gt;
&lt;br /&gt;
The second step is to include the &lt;span class=&quot;code&quot;&gt;IDictionary&lt;/span&gt; interface in the list of base types for my dictionary class, like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;class MyDictionary&amp;lt;TKey, TValue&amp;gt; : Dictionary&amp;lt;TKey, TValue&amp;gt;, IDictionary&amp;lt;TKey, TValue&amp;gt;
{
// ...
}&lt;/pre&gt;
&lt;br /&gt;
How does this help? It makes sure that the compiler will remap the interface methods to the corresponding methods in the class. If there is a method in &lt;span class=&quot;code&quot;&gt;MyDictionary&lt;/span&gt; whose signature corresponds to an &lt;span class=&quot;code&quot;&gt;IDictionary&lt;/span&gt; method, it will be mapped to it; if not, an inherited method will be used.&lt;br /&gt;
&lt;br /&gt;
The final step is to declare the methods you want to modify as new methods:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;public new void Add(TKey key, TValue value)
{
   // ...
   base.Add(key, value);
   // ...
}&lt;/pre&gt;
&lt;br /&gt;
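Put together, a minimal change-notifying dictionary might look like this (the &lt;span class=&quot;code&quot;&gt;Changed&lt;/span&gt; event is my own invention, not part of the framework):&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;class ObservedDictionary&amp;lt;TKey, TValue&amp;gt; : Dictionary&amp;lt;TKey, TValue&amp;gt;, IDictionary&amp;lt;TKey, TValue&amp;gt;
{
    // Raised after every successful modification.
    public event EventHandler Changed;

    private void OnChanged()
    {
        if (Changed != null) Changed(this, EventArgs.Empty);
    }

    public new void Add(TKey key, TValue value)
    {
        base.Add(key, value);
        OnChanged();
    }

    public new bool Remove(TKey key)
    {
        bool removed = base.Remove(key);
        if (removed) OnChanged();
        return removed;
    }

    public new void Clear()
    {
        base.Clear();
        OnChanged();
    }

    public new TValue this[TKey key]
    {
        get { return base[key]; }
        set { base[key] = value; OnChanged(); }
    }
}&lt;/pre&gt;
&lt;br /&gt;
Keep in mind that the notifications fire only for callers that go through the &lt;span class=&quot;code&quot;&gt;IDictionary&amp;lt;TKey, TValue&amp;gt;&lt;/span&gt; interface or the derived class itself; anyone holding a plain &lt;span class=&quot;code&quot;&gt;Dictionary&amp;lt;TKey, TValue&amp;gt;&lt;/span&gt; reference will silently bypass them.&lt;br /&gt;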
That&#39;s all there is to it, really. Some things haven&#39;t changed that much since the bad old times of COM and COM+.</description><link>http://beardseye.blogspot.com/2007/09/nuts-cemented-collections.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>1</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-3039360670786330815</guid><pubDate>Mon, 13 Aug 2007 15:13:00 +0000</pubDate><atom:updated>2011-11-07T22:51:34.657-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">.NET</category><category domain="http://www.blogger.com/atom/ns#">.NUTS</category><category domain="http://www.blogger.com/atom/ns#">boxing</category><category domain="http://www.blogger.com/atom/ns#">enums</category><category domain="http://www.blogger.com/atom/ns#">generics</category><category domain="http://www.blogger.com/atom/ns#">performance</category><category domain="http://www.blogger.com/atom/ns#">profiling</category><title>.NUTS: Enum Conundrum</title><description>&lt;span style=&quot;font-weight: bold;&quot;&gt;Welcome to .NUTS&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
I&#39;ve been avoiding .NET for quite a long time. It&#39;s not that I&#39;m an anti-Microsoft zealot, it&#39;s just that I used to code in Java at work and I figured that one platform full of frustrating limitations was more than enough. Recently, however, I decided to learn XNA on my own free time and that means getting involved with .NET and C#. It turned out to be just as frustrating as I feared, but at least it provided me with some nice blog material. Thus is the .NUTS &quot;column&quot; born in my blog, to document all that is weird in the .NET world. Expect stuff that might make you giggle, goggle and groan.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Enums Revisited&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
I first encountered the concept of enumerated types when I learned Pascal. Before that I owned a Commodore, so I dabbled in Simons&#39; BASIC and 6502 assembler. Therefore, I didn&#39;t approach enums the way a C programmer might -- as a cleaner way of defining constants -- but as a way of defining a completely new type that accepts only those identifiers I define. The difference is subtle, but important: it shapes your attitude.&lt;br /&gt;
&lt;br /&gt;
In .NET, enums have, obviously, inherited a lot of baggage from COM, which, in turn, comes from C. Furthermore, that baggage has been squeezed to fit the .NET luggage rack, where everything is an object, even the value types. Of course, value types are not really first-class objects, because they have to be boxed first. All in all, .NET enums are a part of a very &lt;a href=&quot;http://www.joelonsoftware.com/articles/LeakyAbstractions.html&quot;&gt;leaky abstraction&lt;/a&gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Starting Point&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
My journey of discovery started with an innocent little detail inside an excellent &lt;a href=&quot;http://blogs.msdn.com/shawnhar/archive/2007/07/02/twin-paths-to-garbage-collector-nirvana.aspx&quot;&gt;post&lt;/a&gt; on &lt;a href=&quot;http://blogs.msdn.com/shawnhar/default.aspx&quot;&gt;Shawn Hargreaves&#39; blog&lt;/a&gt;. If you haven&#39;t read it yet, I strongly recommend it. I found it quite enlightening, especially the following advice:&lt;br /&gt;
&lt;blockquote&gt;
If you use an enum type as a dictionary key, internal dictionary operations will cause boxing. You can avoid this by using integer keys, and casting your enum values to ints before adding them to the dictionary.&lt;/blockquote&gt;
I found it strange that ints don&#39;t get boxed, while enums do. However, I was in a hurry to cobble my prototype together, so I just wrote it off as another one of those language eccentricities one encounters occasionally.&lt;br /&gt;
&lt;br /&gt;
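For reference, Shawn&#39;s int-key workaround looks like this -- a cast at every use site in exchange for no boxing (&lt;span class=&quot;code&quot;&gt;Button&lt;/span&gt; is just an example enum of my own):&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;enum Button { A, B, C }

// int keys instead of Button keys: no boxing inside the dictionary.
Dictionary&amp;lt;int, string&amp;gt; labels = new Dictionary&amp;lt;int, string&amp;gt;();
labels[(int) Button.A] = &quot;Jump&quot;;          // cast on every insert...
string label = labels[(int) Button.A];    // ...and on every lookup&lt;/pre&gt;
&lt;br /&gt;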
A couple of weeks later, I mentioned this to a friend of mine, who has been working in .NET since it first came out. He was even more flabbergasted: &quot;That&#39;s weird. I mean, Microsoft keeps harping on how generic collections allow you to avoid unnecessary boxing. Seeing as how enum is one of the most common types for dictionary keys, the whole thing doesn&#39;t make sense at all.&quot;&lt;br /&gt;
&lt;br /&gt;
That got me even more intrigued, enough to overcome my laziness and make me start poking around. This is where things started getting really interesting.&lt;br /&gt;
&lt;br /&gt;
Now, those of you who don&#39;t care about the why and want the solution, skip to the part called &quot;Finish Line&quot;, down below, near the end of this post.&lt;br /&gt;
&lt;br /&gt;
Still reading? Okay, let&#39;s roll up those sleeves...&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Stage 1: Observation&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
The obvious first step was to confirm Shawn&#39;s claim and work from there. For that, I armed myself with CLR Profiler (thanks to &lt;a href=&quot;http://blogs.msdn.com/jmstall/archive/2005/12/17/CLR-profiler-2-0-available.aspx&quot;&gt;Mike Stall&#39;s blog&lt;/a&gt;) and I hammered out the following piece of code:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;using System;
using System.Collections.Generic;
using System.Text;

namespace Sandbox
{
    enum TestEnum
    {
        e10,
        e9,
        e8,
        e7,
        e6,
        e5,
        e4,
        e3,
        e2,
        e1
    }

    class Program
    {
        static void Main(string[] args)
        {
            Dictionary&amp;lt;TestEnum, int&amp;gt; dict = new Dictionary&amp;lt;TestEnum, int&amp;gt;();

            for (int l = 0; l &amp;lt; 100000; l++)
            {
                TestEnum x = (TestEnum) (l % 10);
                dict[x] = 100000 - (int) x;
            }

            for (TestEnum x = TestEnum.e10; x &amp;lt;= TestEnum.e1; x++)
            {
                Console.WriteLine(dict[x]);
            }
        }
    }
}&lt;/pre&gt;
The results were as bad as Shawn predicted: the code allocated 4,825,246 bytes (more than 4.5 MB), out of which 74.61% were instances of &lt;span class=&quot;code&quot;&gt;Sandbox.TestEnum&lt;/span&gt; type. If an enum is winding up on the heap like that, it must be getting boxed.&lt;br /&gt;
&lt;br /&gt;
The next thing I did was to change the &lt;span class=&quot;code&quot;&gt;Main&lt;/span&gt; method to look like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;        static void Main(string[] args)
        {
            Dictionary&amp;lt;int, int&amp;gt; dict = new Dictionary&amp;lt;int, int&amp;gt;();

            for (int l = 0; l &amp;lt; 100000; l++)
            {
                int x = l % 10;
                dict[x] = 100000 - (int) x;
            }

            for (int x = 0; x &amp;lt; 10; x++)
            {
                Console.WriteLine(dict[x]);
            }
        }&lt;/pre&gt;
The profiler reported 24,692 allocated bytes, with nothing sticking out as especially wasteful. It was clear that .NET handled ints and enums differently, even though they are both value types.&lt;br /&gt;
&lt;br /&gt;
CLR Profiler has a very neat feature that allows you to find out which part(s) of the code allocated the memory, by right-clicking on the offending bar in the histogram view and selecting the &quot;Show Who Allocated&quot; option. That will open the allocation graph view, which in my enum case looked like this:&lt;br /&gt;
&lt;div style=&quot;text-align: center;&quot;&gt;
&lt;a aiotitle=&quot;&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipRHy_z_Pop4FOQZLRX-oAmKYtnNcjOr8Nb3F4Fdyw7QTIwZbjHf2IWFJQEW2tAIjurFJyrN4_n6AuYszya_K_cfH2beosMXRLB0WIhay1stWgb_hixmwjyQY9lwHoe0rp-s09q_WKXtU/s1600-h/enum_allocation_graph.PNG&quot;&gt;&lt;img alt=&quot;&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5098314440141873762&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipRHy_z_Pop4FOQZLRX-oAmKYtnNcjOr8Nb3F4Fdyw7QTIwZbjHf2IWFJQEW2tAIjurFJyrN4_n6AuYszya_K_cfH2beosMXRLB0WIhay1stWgb_hixmwjyQY9lwHoe0rp-s09q_WKXtU/s400/enum_allocation_graph.PNG&quot; style=&quot;cursor: pointer; display: block; margin: 0px auto 10px; text-align: center;&quot; /&gt;&lt;/a&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;(click on the image to enlarge)&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
Apparently, my enum was being boxed by &lt;span class=&quot;code&quot;&gt;ObjectEqualityComparer&lt;/span&gt;, inside the &lt;span class=&quot;code&quot;&gt;Insert&lt;/span&gt; method in the generic &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; class. Culprit identified, time to move on to the next stage.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Stage 2: Under the Hood&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Contrary to my first impulse -- whip out the IL Disassembler -- I decided to first check whether the help files mentioned &lt;span class=&quot;code&quot;&gt;ObjectEqualityComparer&lt;/span&gt;. No such luck. ILDASM it is, then. The disassembly of &lt;span class=&quot;code&quot;&gt;Insert&lt;/span&gt; method in the generic &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; class revealed the following interesting snippet:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;  IL_001e:  ldfld      class System.Collections.Generic.IEqualityComparer`1&amp;lt;!0&amp;gt; class System.Collections.Generic.Dictionary`2&amp;lt;!TKey,!TValue&amp;gt;::comparer
  IL_0023:  ldarg.1
  IL_0024:  callvirt   instance int32 class System.Collections.Generic.IEqualityComparer`1&amp;lt;!TKey&amp;gt;::GetHashCode(!0)&lt;/pre&gt;
So my suspect, &lt;span class=&quot;code&quot;&gt;ObjectEqualityComparer&lt;/span&gt;, is an implementation of the &lt;span class=&quot;code&quot;&gt;IEqualityComparer&lt;/span&gt; interface, contained in the comparer field inside my &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; object. Here&#39;s what the help file says about &lt;span class=&quot;code&quot;&gt;IEqualityComparer&lt;/span&gt;:&lt;br /&gt;
&lt;blockquote&gt;
This interface allows the implementation of customized equality comparison for collections. That is, you can create your own definition of equality for type &lt;span style=&quot;font-style: italic;&quot;&gt;T&lt;/span&gt;, and specify that this definition be used with a collection type that accepts the &lt;span style=&quot;font-weight: bold;&quot;&gt;IEqualityComparer&lt;/span&gt; generic interface. In the .NET Framework, constructors of the Dictionary generic collection type accept this interface.&lt;br /&gt;
&lt;br /&gt;
A default implementation of this interface is provided by the Default property of the EqualityComparer generic class. The StringComparer class implements &lt;span style=&quot;font-weight: bold;&quot;&gt;IEqualityComparer&lt;/span&gt; of type String. &lt;/blockquote&gt;
A bit more digging in the help files revealed this:&lt;br /&gt;
&lt;blockquote&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Dictionary&lt;/span&gt; requires an equality implementation to determine whether keys are equal. You can specify an implementation of the IEqualityComparer generic interface by using a constructor that accepts a &lt;span style=&quot;font-style: italic;&quot;&gt;comparer&lt;/span&gt; parameter; if you do not specify an implementation, the default generic equality comparer EqualityComparer.Default is used. If type &lt;span style=&quot;font-style: italic;&quot;&gt;TKey&lt;/span&gt; implements the System.IEquatable generic interface, the default equality comparer uses that implementation.&lt;/blockquote&gt;
So how does the &lt;span class=&quot;code&quot;&gt;EqualityComparer.Default&lt;/span&gt; get defined? According to the help file, like this:&lt;br /&gt;
&lt;blockquote&gt;
The Default property checks whether type &lt;span style=&quot;font-style: italic;&quot;&gt;T&lt;/span&gt; implements the System.IEquatable generic interface and if so returns an &lt;span style=&quot;font-weight: bold;&quot;&gt;EqualityComparer&lt;/span&gt; that uses that implementation. Otherwise it returns an &lt;span style=&quot;font-weight: bold;&quot;&gt;EqualityComparer&lt;/span&gt; that uses the overrides of Object.Equals and Object.GetHashCode provided by &lt;span style=&quot;font-style: italic;&quot;&gt;T&lt;/span&gt;.&lt;/blockquote&gt;
&lt;br /&gt;
In other words, the boxing happens because my enum doesn&#39;t implement the &lt;span class=&quot;code&quot;&gt;IEquatable&lt;/span&gt; interface. The funny thing is that the &lt;span class=&quot;code&quot;&gt;Int32&lt;/span&gt; value type implements &lt;span class=&quot;code&quot;&gt;IEquatable&lt;/span&gt;, as does every other primitive numeric type. Yet enums don&#39;t. When you stop to think of it, it &lt;span style=&quot;font-style: italic;&quot;&gt;sounds&lt;/span&gt; logical: the &lt;span class=&quot;code&quot;&gt;Enum&lt;/span&gt; value type can&#39;t implement &lt;span class=&quot;code&quot;&gt;IEquatable&lt;/span&gt; because that would theoretically allow you to compare any enum to any other enum and obtain bogus results. Not that I&#39;m convinced by that argument, but it&#39;s really a moot point, seeing as how I can&#39;t change that myself. That leaves looking for a workaround, which brings us to the last stage.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Stage 3: The Lesser Evil&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
Once you&#39;re reduced to looking for a workaround, your job is to find the least dirty and the least ugly solution. The first thing I tried was to make my enum implement the &lt;span class=&quot;code&quot;&gt;IEquatable&lt;/span&gt; interface. Naturally, there was no way to make C# swallow that syntax. Next I tried setting the &lt;span class=&quot;code&quot;&gt;EqualityComparer&amp;lt;TestEnum&amp;gt;.Default&lt;/span&gt; property, but unfortunately it&#39;s read-only. Finally, grudgingly, I conceded to write my own implementation of &lt;span class=&quot;code&quot;&gt;IEqualityComparer&lt;/span&gt;. The first attempt was to write it as a generic class, but when C# complained about not being able to explicitly cast an unknown value type to int (within my &lt;span class=&quot;code&quot;&gt;GetHashCode&lt;/span&gt; method), it finally drove the lesson home: &lt;a href=&quot;http://blogs.msdn.com/ericlippert/archive/2007/06/18/calling-static-methods-on-type-variables-is-illegal-part-two.aspx&quot;&gt;.NET generics are definitely not the same thing as C++ templates&lt;/a&gt;.&lt;br /&gt;
&lt;br /&gt;
The final solution looked like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;    class TestEnumComparer : IEqualityComparer&amp;lt;TestEnum&amp;gt;
    {
        public static readonly TestEnumComparer Instance = new TestEnumComparer();

        #region IEqualityComparer&amp;lt;TestEnum&amp;gt; Members

        public bool Equals(TestEnum x, TestEnum y)
        {
            return (x == y);
        }

        public int GetHashCode(TestEnum obj)
        {
            return (int) obj;
        }

        #endregion
    }&lt;/pre&gt;
The dictionary declaration would then look like this:&lt;br /&gt;
&lt;pre class=&quot;brush: csharp&quot;&gt;            Dictionary&amp;lt;TestEnum, int&amp;gt; dict = new Dictionary&amp;lt;TestEnum, int&amp;gt;(TestEnumComparer.Instance);&lt;/pre&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Finish Line&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
To sum it up, for each enum type you intend to use as a dictionary key, you&#39;ll have to write your implementation of &lt;span class=&quot;code&quot;&gt;IEqualityComparer&lt;/span&gt; and pass it to &lt;span class=&quot;code&quot;&gt;Dictionary&lt;/span&gt; constructor. For each enum type you&#39;ll have to type 18 lines of boilerplate code -- you can squeeze it into less, I know -- but that&#39;s a relatively small price to pay for avoiding the memory waste caused by boxing your keys. I just wish it wasn&#39;t that ugly, but at least it&#39;s simple enough. Stay tuned for more .NUTS in the future.</description><link>http://beardseye.blogspot.com/2007/08/nuts-enum-conundrum.html</link><author>noreply@blogger.com (Anonymous)</author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipRHy_z_Pop4FOQZLRX-oAmKYtnNcjOr8Nb3F4Fdyw7QTIwZbjHf2IWFJQEW2tAIjurFJyrN4_n6AuYszya_K_cfH2beosMXRLB0WIhay1stWgb_hixmwjyQY9lwHoe0rp-s09q_WKXtU/s72-c/enum_allocation_graph.PNG" height="72" width="72"/><thr:total>8</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-4112588951844519545</guid><pubDate>Fri, 01 Jun 2007 20:10:00 +0000</pubDate><atom:updated>2011-11-07T22:54:07.335-03:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">design patterns</category><category domain="http://www.blogger.com/atom/ns#">hype</category><category domain="http://www.blogger.com/atom/ns#">OOAD</category><category domain="http://www.blogger.com/atom/ns#">programming languages</category><title>Dark Side of Design Patterns</title><description>Every now and then I overhear someone excitedly talking to someone else about Design Patterns. They&#39;re so excited about it, you can hear the capital D and capital P quite clearly. 
And they use &lt;span style=&quot;font-style: italic;&quot;&gt;that&lt;/span&gt; tone: you know, the one usually reserved for great sports achievements. And, invariably, I can&#39;t resist the temptation to get involved. Things usually go downhill from there.&lt;br /&gt;
&lt;br /&gt;
I can understand their chagrin. How would you like it if you just discovered something cool only to have some spoilsport tell you that roughly half of it is crap? Yet I just can&#39;t keep quiet; partly because I&#39;m mean and love seeing their reactions, but mostly because I&#39;m sick of the hype.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Patternitis&lt;/span&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%; font-style: italic;&quot;&gt;&quot;The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.&quot;&lt;/span&gt;&lt;/div&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%; font-style: italic;&quot;&gt;Bertrand Russell&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
Don&#39;t get me wrong, I think design patterns are a great idea. Hardly a new one, but great nevertheless. The problem is that every Great Idea attracts its share of fanatics who think it&#39;s the coolest thing since sliced bread. It gets paraded around as a Silver Bullet and a Golden Hammer until its limitations percolate through the layers of hype. Inevitably, the general hype dies down -- if we&#39;re talking about a genuine Great Idea, not glossyware stuff like SOA -- and all that&#39;s left are pockets of hype that form around enthusiastic newbies or incompetent know-it-alls.&lt;br /&gt;
&lt;br /&gt;
I personally love stamping out those little pockets and that&#39;s what I want to do with this blog post. I&#39;ll start by asserting the following:&lt;br /&gt;
&lt;ol&gt;
&lt;li&gt;If you think that developing software can be reduced to applying and combining design patterns, you are &lt;span style=&quot;font-weight: bold;&quot;&gt;wrong&lt;/span&gt;.&lt;/li&gt;
&lt;li&gt;If you think that a design pattern you don&#39;t completely understand must nevertheless be good, you are &lt;span style=&quot;font-weight: bold;&quot;&gt;wrong.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;If you think that design patterns are a new invention made possible by OOP, you are &lt;span style=&quot;font-weight: bold;&quot;&gt;wrong&lt;/span&gt;.&lt;/li&gt;
&lt;li&gt;If you think that someone is a good programmer just because they know design patterns, you are &lt;span style=&quot;font-weight: bold;&quot;&gt;wrong&lt;/span&gt;.&lt;/li&gt;
&lt;li&gt;If you think that you must know design patterns to be a good programmer, you are &lt;span style=&quot;font-weight: bold;&quot;&gt;wrong&lt;/span&gt;.&lt;/li&gt;
&lt;/ol&gt;
Now that the most common misconceptions are out of our way, we can get to separating the wheat from the chaff.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Gold vs. Other Glittery Stuff&lt;/span&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&quot;Any man whose errors take ten years to correct is quite a man.&quot;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;J. Robert Oppenheimer&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
What is a design pattern? The original book by Gamma, Helm, Johnson and Vlissides (popularly known as GoF) takes a page and a half to answer this question. I prefer a shorter and simpler answer, even if it lacks the full accuracy of the original. Wikipedia has a pretty good definition:&lt;br /&gt;
&lt;blockquote&gt;
In software engineering (or computer science), a design pattern is a general repeatable solution to a commonly occurring problem in software design. A design pattern is not a finished design that can be transformed directly into code. It is a description or template for how to solve a problem that can be used in many different situations. Object-oriented design patterns typically show relationships and interactions between classes or objects, without specifying the final application classes or objects that are involved. Algorithms are not thought of as design patterns, since they solve computational problems rather than design problems.&lt;/blockquote&gt;
Still a bit long for my taste, but it&#39;s quite usable. As a matter of fact, the first sentence is the most important; the rest just helps cement the scope.&lt;br /&gt;
&lt;br /&gt;
The GoF book defines 23 design patterns: 5 creational, 7 structural and 11 behavioral. It took me a lot of conscious effort and quite a few years of working experience to admit that not all of those patterns are gold. In fact, the split between wheat and chaff is almost exactly fifty-fifty: there are 12 that I could describe as &quot;okay&quot;, &quot;good&quot; or &quot;great&quot; and 11 that I see as &quot;hack&quot;, &quot;poor&quot; or even &quot;rubbish&quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;The Good&lt;/span&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&quot;The remarkable thing about Shakespeare is that he really is very good, in spite of all the people who say he is very good.&quot;&lt;/span&gt;&lt;br /&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;Robert Graves&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
If I learned something from my previous managers, it&#39;s to always dangle the carrot before using the stick. Seriously, though, the original GoF book has some really good patterns in it.&lt;br /&gt;
&lt;br /&gt;
Take Bridge, for example. The best thing about this pattern is that it teaches you about the separation of concerns. You can adapt it to classless object-oriented languages, you can even apply it in structural programming languages. The main point is to keep the implementation primitives separate from the operations that use them.&lt;br /&gt;
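&lt;br /&gt;
To make it concrete, here&#39;s a minimal sketch in Java (the names are mine, not from the book): the abstraction keeps a reference to an implementor interface, so either side can vary without touching the other.&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;// Implementor: the low-level drawing primitives
interface Renderer {
  void drawLine(int x1, int y1, int x2, int y2);
}

class ConsoleRenderer implements Renderer {
  public void drawLine(int x1, int y1, int x2, int y2) {
    System.out.println(&quot;line &quot; + x1 + &quot;,&quot; + y1 + &quot; to &quot; + x2 + &quot;,&quot; + y2);
  }
}

// Abstraction: operations built on top of the primitives
abstract class Shape {
  protected final Renderer renderer;
  protected Shape(Renderer renderer) { this.renderer = renderer; }
  public abstract void draw();
}

class Rect extends Shape {
  private final int x, y, w, h;
  public Rect(Renderer r, int x, int y, int w, int h) {
    super(r);
    this.x = x; this.y = y; this.w = w; this.h = h;
  }
  public void draw() {
    renderer.drawLine(x, y, x + w, y);          // top
    renderer.drawLine(x + w, y, x + w, y + h);  // right
    renderer.drawLine(x + w, y + h, x, y + h);  // bottom
    renderer.drawLine(x, y + h, x, y);          // left
  }
}&lt;/pre&gt;
Swapping ConsoleRenderer for, say, an OpenGL-backed implementor touches nothing in the shape hierarchy, and vice versa.&lt;br /&gt;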
&lt;br /&gt;
Builder is another example of a highly useful pattern. Just look at SAX: they provide the director and you have to write the builder. Highly efficient and very elegant. By the way, I really recommend reading the original GoF book if you want to understand this pattern. Some of the other books attempt to explain it by applying it to convoluted problems such as constructing a MIME message or building a user interface. A word of advice to the authors: KISS.&lt;br /&gt;
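&lt;br /&gt;
A toy version in the SAX spirit (all names here are my own invention): the director owns the sequence of construction events, the builder owns the product.&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;// Builder: receives construction events, SAX-handler style
interface MarkupBuilder {
  void startElement(String name);
  void text(String content);
  void endElement();
}

// Director: knows the order of events, not what gets built
class Greeting {
  void emitTo(MarkupBuilder builder) {
    builder.startElement(&quot;p&quot;);
    builder.text(&quot;hello&quot;);
    builder.endElement();
  }
}

// One concrete builder; an XML writer or a DOM builder would be others
class PlainTextBuilder implements MarkupBuilder {
  private final StringBuilder out = new StringBuilder();
  public void startElement(String name) {}          // structure ignored
  public void text(String content) { out.append(content); }
  public void endElement() { out.append(&quot;\n&quot;); }
  public String toString() { return out.toString(); }
}&lt;/pre&gt;
Pointing the same director at an XML-writing builder would produce markup instead of plain text, with zero changes to the director.&lt;br /&gt;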
&lt;br /&gt;
A few other great patterns include Command, Composite and Template Method. These really make the most of OOP, and it shows: each is a clean example of putting polymorphism to good work.&lt;br /&gt;
&lt;br /&gt;
Then there is a whole range of patterns that have been around for a long time, long before OOP and long before GoF codified design patterns as we know them: Adapter, Façade, Flyweight, Chain of Responsibility and Observer.&lt;br /&gt;
&lt;br /&gt;
Finally, I reserve a special place for Mediator, not because it&#39;s better than the rest, but because it&#39;s a perfect example of a good pattern that can be horribly abused. One word: Delphi. The &quot;RAD paradigm&quot; promoted by Delphi is to add objects to their containers and then write all the logic in a few huge Mediator classes. Delphi is based on an object-oriented language, has an excellent core framework and class library and yet it teaches its users to forget OOP. Bottom line: no matter how cool the pattern, don&#39;t overdo it.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;The Ugly&lt;/span&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&quot;Beauty is the first test; there is no permanent place in the world for ugly mathematics.&quot;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;G.H. Hardy&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
We have all, at some point in our lives, had to choose the lesser of two evils. That&#39;s most likely what you will be doing when you decide to apply one of the following: Proxy, Memento, Visitor, Decorator or State. All of these are cumbersome solutions for something that should have been supported or aided by the language itself.&lt;br /&gt;
&lt;br /&gt;
Proxy, for example, is a necessary and useful pattern. The problem with it is that a lot of the languages require you to implement it by writing tons of boilerplate code. With some good metaprogramming facilities, Proxy becomes trivial.&lt;br /&gt;
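&lt;br /&gt;
Even Java&#39;s modest reflection facilities show the difference; here&#39;s a sketch using java.lang.reflect.Proxy (the Greeter interface is just a stand-in):&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

interface Greeter {
  String greet(String name);
}

class Proxies {
  // One generic handler covers every method of every interface:
  // no hand-written forwarding method per operation.
  @SuppressWarnings(&quot;unchecked&quot;)
  static &amp;lt;T&amp;gt; T withLogging(final T target, Class&amp;lt;T&amp;gt; iface) {
    InvocationHandler handler = new InvocationHandler() {
      public Object invoke(Object proxy, Method method, Object[] args)
          throws Throwable {
        System.out.println(&quot;calling &quot; + method.getName());
        return method.invoke(target, args);
      }
    };
    return (T) Proxy.newProxyInstance(
        iface.getClassLoader(), new Class[] { iface }, handler);
  }
}&lt;/pre&gt;
One call to Proxies.withLogging gives you a logging proxy for any interface, with zero per-method boilerplate.&lt;br /&gt;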
&lt;br /&gt;
Likewise, the motivation behind Memento is valid, but it should be solved by the language or its core library. Serializing and deserializing objects should be trivial, which means you shouldn&#39;t have to write gobs of code to do it. Look at the way you do it in Java:&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(baos);
try {
  oos.writeObject(originator);
  oos.close();
} catch (IOException e) {
  // won&#39;t happen but we _still_ 
  // have to write code to ignore it
}
byte[] memento = baos.toByteArray();&lt;/pre&gt;
Now compare it to Ruby:&lt;br /&gt;
&lt;pre class=&quot;brush: ruby&quot;&gt;memento = Marshal.dump(originator)&lt;/pre&gt;
I rest my case.&lt;br /&gt;
&lt;br /&gt;
Visitor is another example of hacking your way around language limitations. Most of the object-oriented code anyone writes needs only single dispatch. You can argue that supporting double or multiple dispatch in the language is wasteful, although there are &lt;a href=&quot;http://www.mip.sdu.dk/%7Ebnj/library/chambers99efficient.pdf&quot;&gt;ways&lt;/a&gt; around that. But when your language leaves you with no other choice and you need double dispatch, you will have to use Visitor. If you happen to need multiple dispatch, good luck writing and maintaining that code.&lt;br /&gt;
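&lt;br /&gt;
For reference, the whole trick is faking the second dispatch with a round trip through accept(); the figures below are my own toy example:&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;interface Visitor {
  void visit(Circle c);
  void visit(Square s);
}

interface Figure {
  void accept(Visitor v);
}

class Circle implements Figure {
  // dynamic dispatch picks this accept(); overload resolution on
  // the static type of &quot;this&quot; then picks visit(Circle)
  public void accept(Visitor v) { v.visit(this); }
}

class Square implements Figure {
  public void accept(Visitor v) { v.visit(this); }
}

class NamePrinter implements Visitor {
  public void visit(Circle c) { System.out.println(&quot;circle&quot;); }
  public void visit(Square s) { System.out.println(&quot;square&quot;); }
}&lt;/pre&gt;
Note the price: every new Figure type forces a change to the Visitor interface and to every one of its implementations. That is exactly the maintenance burden a language with multiple dispatch would spare you.&lt;br /&gt;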
&lt;br /&gt;
Finally, we come to Decorator and State. Both of these patterns are useful and both deal with the same issue: sometimes we want to adapt the behavior of individual objects depending on their state or some other trait we want to manipulate at run time. I believe this should be possible without writing all the boilerplate code that Decorator and State require. I&#39;ve learned and experimented with the Self and Io languages, and both let you solve these needs very elegantly. Again, look at a sample implementation of the State pattern in Java:&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;public class Bot {
  
  private int health = 100;
  private BotState state = HEALTHY;
  
  protected void changeState(BotState newState) {
    state = newState;
  }
  
  protected static final BotState HEALTHY = new HealthyState();
  protected static final BotState INJURED = new InjuredState();
  
  protected abstract static class BotState {
    public void takeDamage(Bot bot, int damage) {
      bot.health -= damage;
    }
    
    public void heal(Bot bot, int delta) {
      bot.health += delta;
    }
    
    public void enemySpotted(Bot bot, Object enemy) {}
  }
  
  protected static class HealthyState extends BotState {
    public void takeDamage(Bot bot, int damage) {
      super.takeDamage(bot, damage);
      if (bot.health &amp;lt;= 40) {
        bot.changeState(INJURED);
      }
    }
    
    public void enemySpotted(Bot bot, Object enemy) {
      bot.chase(enemy);
    }
  }
  
  protected static class InjuredState extends BotState {
    public void heal(Bot bot, int delta) {
      super.heal(bot, delta);
      if (bot.health &amp;gt;= 50) {
        bot.changeState(HEALTHY);
      }
    }
    
    public void enemySpotted(Bot bot, Object enemy) {
      bot.avoid(enemy);
    }
  }
  
  public void chase(Object enemy) {
    System.out.print(&quot;Chasing &quot; + enemy);
  }
  
  public void avoid(Object enemy) {
    System.out.print(&quot;Avoiding &quot; + enemy);
  }
  
  public void takeDamage(int damage) {
    state.takeDamage(this, damage);
  }
  
  public void heal(int delta) {
    state.heal(this, delta);
  }
  
  public void enemySpotted(Object enemy) {
    state.enemySpotted(this, enemy);
  }
}&lt;/pre&gt;
Now compare it with the same thing in Io:&lt;br /&gt;
&lt;pre class=&quot;brush: js&quot;&gt;Bot := Object clone do (
  init := method(self health := 100)
  chase := method(enemy, (&quot;Chasing &quot; .. enemy) println)
  avoid := method(enemy, (&quot;Avoiding &quot; .. enemy) println)
  takeDamage := method(damage, health = health - damage)
  heal := method(delta, health = health + delta)
  enemySpotted := method(enemy, nil)
  changeState := method(newState, self setProto(newState))
)

HealthyBot := Bot cloneWithoutInit do (
  takeDamage := method(damage,
    resend
    if (health &amp;lt;= 40, changeState(InjuredBot))
  )
  enemySpotted := method(enemy, chase(enemy))
)

InjuredBot := Bot cloneWithoutInit do (
  heal := method(delta,
    resend
    if (health &amp;gt;= 50, changeState(HealthyBot))
  )
  enemySpotted := method(enemy, avoid(enemy))
)&lt;/pre&gt;
I would even settle for hard, specific syntax supported by the language itself, like in &lt;a href=&quot;http://wiki.beyondunreal.com/wiki/UnrealScript_Language_Reference/States&quot;&gt;UnrealScript&lt;/a&gt;.&lt;br /&gt;
&lt;br /&gt;
Bottom line? Our languages really need to evolve. Combine prototype-based OOP that you can find in Self and Io with excellent metaprogramming offered by Lisp and you will have a language that kicks ass and doesn&#39;t need ugly patterns.&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;The Bad&lt;/span&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&quot;There is no stigma attached to recognizing a bad decision in time to install a better one.&quot;&lt;/span&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;Laurence J. Peter&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
Thus we come to the place where the bad things dwell. There&#39;s a variety of patterns that fall into this group and they do so for diverse reasons. Let&#39;s start with a couple that gave me nightmares for a long while.&lt;br /&gt;
&lt;br /&gt;
Factory Method sounds like a good idea. Upon close examination, Abstract Factory is revealed to be Factory Method applied to Factory Method, which makes it sound even better. Only those who have been fooled by this pattern can appreciate the despair and the subsequent numbness brought on by writing tons and tons of the &lt;a href=&quot;http://en.wikipedia.org/wiki/Abstract_factory_pattern#Examples&quot;&gt;boilerplate code&lt;/a&gt; required by it. The worst thing is that it seems to be the only solution. Until you find out about &lt;a href=&quot;http://www.martinfowler.com/articles/injection.html&quot;&gt;Dependency Injection&lt;/a&gt;. Try it and you&#39;ll see the difference.&lt;br /&gt;
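&lt;br /&gt;
To make the contrast concrete, here&#39;s a sketch of constructor injection (the notification classes are hypothetical): the consuming class never names a factory, concrete or abstract.&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;interface MessageSender {
  void send(String to, String body);
}

class SmtpSender implements MessageSender {
  public void send(String to, String body) {
    System.out.println(&quot;SMTP to &quot; + to + &quot;: &quot; + body);
  }
}

class Notifier {
  // the dependency arrives ready-made; no factory hierarchy needed
  private final MessageSender sender;
  public Notifier(MessageSender sender) { this.sender = sender; }
  public void alert(String user) {
    sender.send(user, &quot;Something happened&quot;);
  }
}&lt;/pre&gt;
The wiring code -- or a DI container -- decides which concrete sender to pass in; Notifier itself stays oblivious.&lt;br /&gt;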
&lt;br /&gt;
Another pattern that really fouls things up is Singleton. For a long time I couldn&#39;t find one good reason why I should have a class with only one instance, instead of making all that code static. Then someone asked what I would do if I had to pass an interface to it. Okay, that justifies it, I can even think of examples other than boilerplate Abstract Factory code. Another valid point is that you might want to switch from one instance to a pool and you don&#39;t want to rewrite the bulk of your code. Then there is the case mentioned in the GoF book that talks about subclassing the singleton and creating a registry of singletons. If you think that things are getting out of hand, you&#39;re right. The solution is the same as above: use Dependency Injection.&lt;br /&gt;
&lt;br /&gt;
Then there&#39;s the Iterator pattern. That one should be easy: use the foreach statement. And if your language doesn&#39;t have one, ask for your time back. It&#39;s simply unacceptable to have to write reams of ugly code for something we use so often; iteration deserves its own syntax.&lt;br /&gt;
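&lt;br /&gt;
For the record, here&#39;s the difference in Java, which finally got its foreach in version 5:&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

class ForeachDemo {
  public static void main(String[] args) {
    List&amp;lt;String&amp;gt; names = Arrays.asList(&quot;Ann&quot;, &quot;Bob&quot;);

    // The pattern, spelled out by hand:
    Iterator&amp;lt;String&amp;gt; it = names.iterator();
    while (it.hasNext()) {
      System.out.println(it.next());
    }

    // The same traversal with language support:
    for (String name : names) {
      System.out.println(name);
    }
  }
}&lt;/pre&gt;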
&lt;br /&gt;
What about the Strategy pattern? According to the GoF and to all subsequent authors, you should use it to encapsulate a family of algorithms and make them interchangeable. That&#39;s about the one thing for which I would &lt;span style=&quot;font-weight: bold;&quot;&gt;not&lt;/span&gt; use it. If you want to do that, use functions. Again, if you don&#39;t have them, ask for your time back. On the other hand, you could use the Strategy pattern when your implementation strategy has to maintain state over time. But then it&#39;s not really Strategy, is it? Slippery things, these patterns.&lt;br /&gt;
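&lt;br /&gt;
Here&#39;s what I mean, sketched in Java with lambdas (a later addition to the language, but it makes the point): the interchangeable algorithm is just a function value.&lt;br /&gt;
&lt;pre class=&quot;brush: java&quot;&gt;import java.util.Arrays;

class SortDemo {
  public static void main(String[] args) {
    String[] words = { &quot;pear&quot;, &quot;fig&quot;, &quot;banana&quot; };

    // the &quot;strategy&quot; is a plain function, no class ceremony
    Arrays.sort(words, (a, b) -&amp;gt; a.length() - b.length());

    System.out.println(Arrays.toString(words));  // prints [fig, pear, banana]
  }
}&lt;/pre&gt;
No named class, no boilerplate -- and in a language designed around functions it gets shorter still.&lt;br /&gt;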
&lt;br /&gt;
Finally, there&#39;s Interpreter. I don&#39;t have any beef with its motivation: just look at how things have been evolving and you&#39;ll see plenty of minilanguages, scripting languages and XML-based formats, all getting interpreted in the normal course of execution of your code. From the printf format strings of old to the XML configuration files of today, the idea has been in use for a loooong time. Thing is, this isn&#39;t a design pattern. It doesn&#39;t fit into the scope of a design pattern. And you won&#39;t make it fit by sketching out your favorite implementation of an interpreter for your favorite minilanguage. It just does not belong.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Patternitis Revisited&lt;/span&gt;&lt;br /&gt;
&lt;div style=&quot;text-align: right;&quot;&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;&quot;When people are free to do as they please, they usually imitate each other.&quot;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;font-size: 78%;&quot;&gt;&lt;span style=&quot;font-style: italic;&quot;&gt;Eric Hoffer&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;
&lt;br /&gt;
I have to give it to GoF though: they had some great ideas, and the only time their scope slipped was with the Interpreter pattern. Sadly, the same can&#39;t be said of the later authors. I&#39;ll say it out loud: interface is &lt;span style=&quot;font-weight: bold;&quot;&gt;not&lt;/span&gt; a design pattern. There are many things you could say about an interface. From a conceptual point of view, it&#39;s the definition of the contract your code will have to satisfy. From a language-specific point of view, it&#39;s a syntactic and semantic construct. But it is definitely not a design pattern. Neither is a symbolic constant name or an accessor method name.&lt;br /&gt;
&lt;br /&gt;
It might seem that I&#39;m beating a dead horse here, but I want to make it absolutely clear that not everything in software development is a design pattern. It&#39;s too easy to lose focus in enthusiasm, so do yourself a favor: learn the design patterns, keep using the ones you like (where applicable) and move on. There&#39;s a lot more to do.</description><link>http://beardseye.blogspot.com/2007/06/dark-side-of-design-patterns.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-6313444746732727007</guid><pubDate>Tue, 06 Feb 2007 14:16:00 +0000</pubDate><atom:updated>2008-05-15T22:10:36.041-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">education</category><category domain="http://www.blogger.com/atom/ns#">paradigms</category><category domain="http://www.blogger.com/atom/ns#">society</category><title>Uneducational Crisis Pt. 3: Approach</title><description>Most of the things I wrote in the previous two posts can be disputed with counter-examples: not every professor is a &lt;span class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_0&quot;&gt;Prima&lt;/span&gt; Donna; not all things they teach you at the university are irrelevant; there are great places where smart and likable people teach you good stuff; everything else is just &quot;bad luck&quot; and you &quot;have to take the good with the bad&quot;. Let me tell you what I think about that:&lt;br /&gt;&lt;br /&gt;&lt;div style=&quot;text-align: center;&quot;&gt;&lt;span style=&quot;font-weight: bold; font-style: italic;&quot;&gt;I don&#39;t buy it.&lt;/span&gt;&lt;br /&gt;&lt;/div&gt;&lt;br /&gt;There&#39;s a big problem with education as a system in general, not just in the CS field. And that problem involves the relevance of what we learn, from kindergarten to university. How many things did you learn in the ground school that you have completely forgotten? 
Let&#39;s face it, as a society we are doing a shoddy job of teaching our kids.&lt;br /&gt;&lt;br /&gt;Why is education so bad? There is probably a variety of factors that enter the whole equation, but I think there are only two of major importance. One of them is the fact that teaching is not the number one motivation behind the existence of schools. Paul Graham has put it succinctly in his essay &quot;&lt;a href=&quot;http://www.paulgraham.com/nerds.html&quot;&gt;Why Nerds are Unpopular&lt;/a&gt;&quot;:&lt;br /&gt;&lt;blockquote&gt;&lt;span style=&quot;;font-family:verdana;font-size:85%;&quot;  &gt;Teenagers now are useless, except as cheap labor in industries like fast food, which evolved to exploit precisely this fact. In almost any other kind of work, they&#39;d be a net loss. But they&#39;re also too  young to be left unsupervised. Someone has to watch over them, and the most efficient way to do this is to collect them together in one place. Then a few adults can watch all of them.&lt;/span&gt;&lt;/blockquote&gt;The other major factor is that the basic approach to teaching is wrong. And that&#39;s what I want to explain here, so let&#39;s delve into it.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;What Do You Want to Be When You Grow Up?&lt;br /&gt;&lt;br /&gt;&lt;/span&gt;Whenever someone asks me how I came to be so obsessed with computers and programming, I tell them to blame my dad. No, he didn&#39;t teach me how to code, nor did he teach me anything important about the computers beyond turning them on and off and washing my hands before touching the keyboard. In fact, he can&#39;t code to save his life, which is perfectly fine, because he&#39;s a writer. What he did do, though, is much more important: he introduced me to computers.&lt;br /&gt;&lt;br /&gt;How did he know I would like computers? That&#39;s the whole point: he didn&#39;t. 
When I was 5 or 6 years old -- I don&#39;t remember clearly anymore -- he started showing me the basics of photography. I found it entertaining, but I didn&#39;t take it too seriously. Then he tried with cinematography and the results were more or less the same. But then he showed me a computer and since that first look I am completely and totally fascinated with computers and the programs they run.&lt;br /&gt;&lt;br /&gt;Why am I telling you this? Because I think I was extremely lucky. From that moment on, I knew what I wanted to be when I grow up; I had an interest that didn&#39;t pass and fade and I always had a new goal to achieve and a new way to improve myself.&lt;br /&gt;&lt;br /&gt;I do know a few other people who have a passion in their life. I also know many more who never discovered anything they really like and lead largely disoriented lives. It sounds harsh, but it&#39;s true: having a spouse and kids cannot replace the feeling of having your own dreams. You can share your dreams with your family and complement one with the other, but in the end they are not interchangeable.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;&lt;span class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_1&quot;&gt;Nosce&lt;/span&gt; Te &lt;span class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_2&quot;&gt;Ipsum&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Here&#39;s the interesting part: I think everyone can and should have the chance to discover what they like. That&#39;s one of the most important things in our personal development and we usually don&#39;t have much help with it.&lt;br /&gt;&lt;br /&gt;That&#39;s why most of the stuff they teach us at school is useless and irrelevant. How do you know something is relevant? You don&#39;t. You have to discover it. It&#39;s not something you can be taught, it&#39;s something you have to find out yourself. 
But that doesn&#39;t mean you shouldn&#39;t have help.&lt;br /&gt;&lt;br /&gt;Educational institutions are ostensibly there to impart knowledge we will need in order to be useful and productive members of the society. But we don&#39;t know where and how we fit into that society. And the society doesn&#39;t know it either, so they just try to pour all sorts of stuff on our developing minds, hoping that after a while we&#39;ll get an idea where to head and how to specialize.&lt;br /&gt;&lt;br /&gt;Optimism is good, but that&#39;s not optimistic: it&#39;s plain ridiculous. Imagine you had to pick what kind of food you want for your wedding party and the way to choose is to first learn by heart the recipes for a whole lot of different dishes and then use that to decide.&lt;br /&gt;&lt;br /&gt;Yes, you will have an idea what each dish tastes like and yes, you might even make a dish or two to check out the taste. You might even get lucky and choose well. But if you made a mistake, there&#39;s no turning back: your wedding party will have food you don&#39;t like and that&#39;s it. You might get married again later and have another wedding party, but this one is gone for good. And the food sucked.&lt;br /&gt;&lt;br /&gt;Of course, in real life you can get help from dedicated experts who specialize in that sort of stuff.&lt;br /&gt;Imagine if you could get help with discovering your own talents and interests. Imagine if you could get that kind of help from people who specialize in it. Wouldn&#39;t life be a whole lot nicer?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Rat Races&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Instead of that, what do we have now? 
We have to assimilate a considerable &lt;span class=&quot;blsp-spelling-corrected&quot; id=&quot;SPELLING_ERROR_3&quot;&gt;quantity&lt;/span&gt; of knowledge in a fixed time period, at the end of which our performance is rated and, if it is found satisfactory we can proceed to the next period and the next batch of knowledge.&lt;br /&gt;&lt;br /&gt;There&#39;s a name you can slap on that description: competition. Our educational paradigm is based on competition and that makes it not merely misguided, but downright harmful. Instead of helping to find what our kids can do well, we force them to become &quot;good enough&quot; at doing a bit of everything. Worse, after passing through our educational maze, the kids are still not really good at doing anything of real worth.&lt;br /&gt;&lt;br /&gt;Consider the most notable effect of our educational model: cheating. I really like how &lt;span class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_4&quot;&gt;Wikipedia&lt;/span&gt; hit the nail on the head in their &lt;a href=&quot;http://en.wikipedia.org/wiki/Cheating&quot;&gt;article&lt;/a&gt;:&lt;br /&gt;&lt;blockquote&gt;A common venue for cheating is in education settings, where it takes a number of forms.&lt;/blockquote&gt;Of course it does. If you&#39;re competing for something that could bring you a gain you can clearly &lt;span class=&quot;blsp-spelling-corrected&quot; id=&quot;SPELLING_ERROR_5&quot;&gt;perceive&lt;/span&gt;, you might be tempted to cheat. But if you&#39;re forced to compete at something that you don&#39;t even &lt;span class=&quot;blsp-spelling-corrected&quot; id=&quot;SPELLING_ERROR_6&quot;&gt;perceive&lt;/span&gt; as lucrative (and nobody else really believes it either), you won&#39;t be &lt;span style=&quot;font-style: italic;&quot;&gt;tempted&lt;/span&gt; to cheat, you will consider it a &lt;span style=&quot;font-style: italic;&quot;&gt;natural&lt;/span&gt; option.&lt;br /&gt;&lt;br /&gt;I&#39;m not against the competition in education. 
Competition can be good, because it fosters a drive towards self-improvement. But you have to &lt;span style=&quot;font-style: italic;&quot;&gt;want&lt;/span&gt; to compete. And that happens either when you&#39;re competing at something you like or for something you &lt;span class=&quot;blsp-spelling-corrected&quot; id=&quot;SPELLING_ERROR_7&quot;&gt;perceive&lt;/span&gt; as lucrative.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Silver Bullet?&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Of course, the solution is certainly a lot harder to implement than to propose. And I didn&#39;t even propose a complete solution, I just gave an idea. And I&#39;m probably not the first to have it or even to voice it. And it certainly wouldn&#39;t solve all the problems and cure all the diseases and stop all the wars.&lt;br /&gt;&lt;br /&gt;But don&#39;t you think it would be nice to try it?</description><link>http://beardseye.blogspot.com/2007/02/uneducational-crisis-pt-3-approach.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-4136020224056684159</guid><pubDate>Fri, 19 Jan 2007 12:41:00 +0000</pubDate><atom:updated>2008-05-15T22:10:08.023-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">education</category><category domain="http://www.blogger.com/atom/ns#">society</category><category domain="http://www.blogger.com/atom/ns#">SSADM</category><category domain="http://www.blogger.com/atom/ns#">UML</category><title>Uneducational Crisis Pt. 2: Subject Matter</title><description>When I think about these last four years, I don&#39;t usually think about the university. Much more important and interesting things have been happening in my life. 
I think that says something about the quality of education as a life experience: you only consider it important when you&#39;re forced to do it during the better part of your day.&lt;br /&gt;&lt;br /&gt;But when I do think about the university, the first thing that comes to mind are the professors. I&#39;ve had a few excellent professors who were a pleasure to study with, but the rest ranged from merely disinterested to downright inept. In my &lt;a href=&quot;http://beardseye.blogspot.com/2007/01/uneducational-crisis-1-human-factor.html&quot;&gt;previous post&lt;/a&gt;, I covered this topic in detail.&lt;br /&gt;&lt;br /&gt;The second thing that comes to mind is what I learned. Or rather, what I was &lt;span style=&quot;font-style: italic;&quot;&gt;supposed&lt;/span&gt; to learn. Now that the things are drawing to their unceremonious end, it&#39;s amazing to look back and see just how many things I&#39;ve been told were&lt;br /&gt;&lt;ul&gt;&lt;li&gt;misleading&lt;/li&gt;&lt;li&gt;wrong&lt;/li&gt;&lt;li&gt;obsolete&lt;/li&gt;&lt;li&gt;stupid&lt;/li&gt;&lt;li&gt;some or all of the above&lt;/li&gt;&lt;/ul&gt;With a lot of real-life experience in software industry, you can afford to be told such things because your brain will reject them automatically after a brief examination. But the reason I started writing about the educational problems in CS is not because I feel cheated out of my money, but because of the kids who go to the university to try to learn.&lt;br /&gt;&lt;br /&gt;You see, I love programming and I like doing it for a living. And I enjoy the company of other people who do it. Sure, school might not matter if you&#39;re good enough (and infatuated enough). The knowledge is out there and you can learn it all by yourself. But you might not find it soon enough and you might get the wrong ideas at school and you might come out of it all utterly confused or frustrated or suffer some other damage. 
There&#39;s enough stuff in real life that will confuse and frustrate you, no need to start out like that.&lt;br /&gt;&lt;br /&gt;So what&#39;s wrong with the stuff they teach you at the university?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Obsolescence&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;I agree that learning about the past is important. You need some context if you want to understand not only the &lt;span style=&quot;font-style: italic;&quot;&gt;how&lt;/span&gt;, but also the &lt;span style=&quot;font-style: italic;&quot;&gt;why&lt;/span&gt; of the things you see today. But that&#39;s no excuse for teaching obsolete stuff.&lt;br /&gt;&lt;br /&gt;Take the waterfall model, for example. For a big enough project, as most real-world projects are, it simply does not work -- there are too many things that can go wrong and &lt;span style=&quot;font-style: italic;&quot;&gt;will&lt;/span&gt; go wrong. And yet the universities insist on teaching it as &lt;span style=&quot;font-weight: bold;&quot;&gt;the&lt;/span&gt; way to execute software development projects.&lt;br /&gt;&lt;br /&gt;Another good example is the insistence on Data Flow Diagrams. I&#39;m not implying that &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_0&quot;&gt;DFDs&lt;/span&gt; are obsolete. Nor am I saying the same about any other concept from &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_1&quot;&gt;SSADM&lt;/span&gt;. 
Had any of our professors bothered to teach us &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_2&quot;&gt;SSADM&lt;/span&gt; well, I might have learned enough to compare it to what I&#39;m used to working with.&lt;br /&gt;&lt;br /&gt;As far as &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_3&quot;&gt;DFDs&lt;/span&gt; go, I find them stunningly imprecise and uncomfortable, compared to &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_4&quot;&gt;UML&lt;/span&gt;. This, of course, is a topic for a discussion that goes beyond the scope of this post, but it&#39;s also a bit beside the point. The point is that if the course is called Systems Analysis and Design, I don&#39;t think it can stay locked in time forever, covering &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_5&quot;&gt;SSADM&lt;/span&gt; in depth and merely mentioning the existence of &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_6&quot;&gt;UML&lt;/span&gt; and iterative processes and the myriad of other techniques that evolved in the mean time.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Scope&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;The problem with &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_7&quot;&gt;SSADM&lt;/span&gt; vs. &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_8&quot;&gt;UML&lt;/span&gt; is just an specific case of a bigger and more general problem: the scope is all wrong. 
If you are going to give an &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_9&quot;&gt;OOP&lt;/span&gt; course, you should take care to explain what &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_10&quot;&gt;OOP&lt;/span&gt; is, what &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_11&quot;&gt;OOP&lt;/span&gt; isn&#39;t, why it&#39;s been invented, how it works, etc. Sounds reasonable, but it doesn&#39;t happen. What you get instead is a Java course, where in the first few classes you get a very brief, confused and inaccurate summary of &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_12&quot;&gt;OOP&lt;/span&gt; concepts and then you delve into the language specifics.&lt;br /&gt;&lt;br /&gt;I love learning new languages. As Kevin &lt;span onclick=&quot;BLOG_clickHandler(this)&quot; class=&quot;blsp-spelling-error&quot; id=&quot;SPELLING_ERROR_13&quot;&gt;Kelleher&lt;/span&gt; notes, each language tries to &lt;a href=&quot;http://paulgraham.com/fix.html&quot;&gt;solve some problem&lt;/a&gt;. That means that each new language you learn might teach you about a new class of problems or a new class of solutions. The thing is, you have to realize what the problems are and why they are problems before you can benefit from learning a new language. Otherwise you&#39;re just learning syntax.&lt;br /&gt;&lt;br /&gt;You could do it the way I did it: learn a new language, compare it with the one you knew before and gradually become aware of the new problems. After enough time and enough new languages, you&#39;ll be able to think of languages in terms of problems and solutions. Indeed, nothing can completely replace the process of learning from your own experience. 
But that doesn&#39;t mean you aren&#39;t entitled to a decent formal education.&lt;br /&gt;&lt;br /&gt;It&#39;s like your parents telling you not to play with fire. You might heed their advice and still get burned by something you didn&#39;t know was hot. Or you might disregard their advice and get burned. But if they didn&#39;t tell you not to play with fire, you would surely get burned out of ignorance.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Specialization&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;The problem of scope is what you&#39;ll see at the individual course level. But when you take a step back and examine the entire curriculum, you&#39;ll find yourself facing the opposite extreme: breadth. The sheer breadth of subjects covered, combined with the scope of each course, makes you wonder why you got yourself into this at all.&lt;br /&gt;&lt;br /&gt;Here&#39;s a bit of news for the guys who design the educational program: most software developers don&#39;t need to know about the implementation specifics of the datalink layer in ISDN. Nor do most network experts really need to know how to design a database in third normal form.&lt;br /&gt;&lt;br /&gt;I understand why the universities do this. It&#39;s a lot more manageable to have your courses flow sequentially and give everyone who survives the same diploma. Teach them all and let the job sort them out. The question is, should ease of management excuse the poor results? 
Let&#39;s put it to a test: what would you think of a manager who told you, before the project started, that you&#39;ll have to implement a web-based user interface, a Swing GUI and an interactive textual interface, because it&#39;s easier to plan the project that way than to plan for capturing the UI requirements?&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Relevance&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;All of the problems I&#39;ve described here can be summarised in one word: relevance. Obsolete subjects are irrelevant to what you&#39;ll be doing. Courses with the wrong scope teach you irrelevant things. Lack of specialization ensures you&#39;ll be learning a lot of irrelevant stuff. I postulate that every problem with the subject matter in CS education can be filed under relevance.&lt;br /&gt;&lt;br /&gt;So how do we solve this? I hate doing this, but I&#39;m afraid I have to do a Robert Jordan and leave you hanging. By now I&#39;ve built enough context to engage in the discussion I wanted to start from the beginning, but that discussion merits its own post. 
Therefore, I have no choice but to say &lt;a href=&quot;http://www.dragonmount.com/Faq/index.php?sid=233219&amp;amp;lang=en&amp;amp;action=artikel&amp;amp;cat=2&amp;amp;id=7&amp;amp;artlang=en&quot;&gt;RAFO&lt;/a&gt;.&lt;br /&gt;&lt;br /&gt;At least I promise to finish in the third post.</description><link>http://beardseye.blogspot.com/2007/01/uneducational-crisis-pt-2-subject.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item><item><guid isPermaLink="false">tag:blogger.com,1999:blog-1593691839906710481.post-8610098139747791127</guid><pubDate>Tue, 16 Jan 2007 17:37:00 +0000</pubDate><atom:updated>2008-05-15T22:09:38.396-04:00</atom:updated><category domain="http://www.blogger.com/atom/ns#">education</category><category domain="http://www.blogger.com/atom/ns#">incompetence</category><category domain="http://www.blogger.com/atom/ns#">society</category><title>Uneducational Crisis Pt. 1: Human Factor</title><description>If there&#39;s one thing worse than a bad programmer, it&#39;s a failed programmer turned professor. I don&#39;t know how common this is in the United States, but here in Chile you see it a lot. And try as I might to pretend that I don&#39;t care, it&#39;s actually driving me nuts. Quite understandably, too: I could be doing interesting things instead of trying to guess what the latest in a row of brain-dead dilettantes wants me to write as the solution to his &quot;ingenious&quot; exam exercise.&lt;br /&gt;&lt;br /&gt;On the other hand, this way I get first-hand insight into the problems with education in the CS field. Some of these problems, I suppose, are local and owe their existence to a variety of economic, political and technological factors, but I&#39;m convinced that some are universal. 
Nonetheless, I shall not discriminate: I&#39;ll go over them all, and a plague on both their houses.&lt;br /&gt;&lt;br /&gt;This article, the first in the series, deals with the professors themselves.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Two Roads Diverged in a Yellow Wood...&lt;br /&gt;&lt;br /&gt;&lt;/span&gt;Over time, I&#39;ve learned to distinguish two types of CS professors:&lt;br /&gt;&lt;ol&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;The Geek.&lt;/span&gt; This guy knows his stuff, because he actually works in the computer industry and does his job well. Sometimes he is motivated by a genuine desire to teach new generations, but often he is teaching just to make a bit of money on the side or to lend a bit of academic weight to his name. More often than not, he is wonderfully inept at imparting his knowledge to the students, because he&#39;s a computer geek, not a teacher. All in all, he&#39;s guaranteed to be relaxed, easy-going and flexible. Mostly harmless.&lt;br /&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;The Prima Donna.&lt;/span&gt; Who needs knowledge when you have Orwellian authority? This is the guy who will typically boast at least 20 years of experience in the computer industry, who has a prominent consulting company and is certified by Carnegie Mello [sic] University. His classes are pure poetry. Vogon poetry, to be precise.&lt;/li&gt;&lt;/ol&gt;The Prima Donna is the type that does the most damage, so he merits a detailed dissection.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Spot the Prima Donna&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;A Prima Donna professor is rather easy to recognize. 
He will exhibit a combination of several seamlessly coupled behavioral patterns from the following list:&lt;br /&gt;&lt;ul&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Smugness.&lt;/span&gt; Usually one of the first signs that you&#39;re dealing with a Prima Donna is that self-satisfied attitude aimed at making you feel that some day, with lots of effort and a divine intervention or two, you&#39;ll get to be as good as this guy.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Hand Waving.&lt;/span&gt; One of the most valuable techniques for explaining the unknown is hand waving. To those who came to learn, it seems that the Prima Donna is making a herculean effort to explain the concept and that they are simply too dumb to understand him.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Stating the Obvious.&lt;/span&gt; A good presentation slide will show only the key concepts and leave the explanation to the presenter. A bad presentation slide will show all the information in a pile of text you won&#39;t bother to read. In both cases, the Prima Donna will repeat the contents of the slide, with lots of Hand Waving, and you&#39;ll be left no wiser than before.&lt;br /&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Exasperation.&lt;/span&gt; Sometimes a braver student will ask the Prima Donna to clarify something previously &quot;explained&quot; by Hand Waving and Stating the Obvious. 
If the Prima Donna is not in the mood to do it all again, he&#39;ll inform the audience that he has explained it several times already.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Selective Hearing.&lt;/span&gt; If there&#39;s a student willing to press the issue despite the Exasperation, the Prima Donna might choose to answer another student&#39;s unrelated query or to simply move on to the next topic.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Condescension.&lt;/span&gt; You never know when you might get lucky: sometimes a Prima Donna will even give away unearned points when correcting an exam. But don&#39;t worry, he will make sure to inform you about it.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Objective Subjectivity.&lt;/span&gt; He might be teaching you how to solve problems that have more than one correct solution, but you&#39;d better make sure you solve them the way the Prima Donna taught you to.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Asserting the Authority.&lt;/span&gt; The Prima Donna will make sure you don&#39;t forget that he&#39;s the professor and you&#39;re the student, which means you owe him a certain respect. He&#39;ll be especially sensitive to certain forms of disrespect, such as disagreeing with him.&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Showcasing the Merits.&lt;/span&gt; In case you don&#39;t seem to be convinced of the Prima Donna&#39;s authority, you&#39;ll be informed of one or more of his certifications, nominations, recognitions, citations, publications, you-get-the-idea-tions.&lt;br /&gt;&lt;/li&gt;&lt;/ul&gt;Occasionally, you might get really unlucky and wind up with an embodiment of the archetypal Prima Donna. 
If there is an option to get the authorities to replace him, go for it.&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight: bold;&quot;&gt;Damage Done&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;It&#39;s becoming increasingly popular to be a nerd. Sure, you&#39;re still going to be bullied in school, but at least now you know there might be a bright future in store for you. And those around you know it too. People tend to pay attention to things that might make them rich, and when &lt;a href=&quot;http://en.wikipedia.org/wiki/Bill_Gates#Early_life&quot;&gt;one of the richest people&lt;/a&gt; in the world is downright nerdy, you can bet it won&#39;t go unnoticed.&lt;br /&gt;&lt;br /&gt;The net result is that you get a lot of CS students who are in it for the glory and riches. These people don&#39;t know a lot about computers, but that&#39;s okay, because they&#39;re there to learn. The problem is that a lot of them also lack the talent. They&#39;re simply not good at computer science, or at any natural science at all.&lt;br /&gt;&lt;br /&gt;Imagine all these people trying to learn about pointers from a Geek professor. Confusion abounds, chaos reigns supreme, and in the end a lot of them will either fail the course (and hopefully &lt;a href=&quot;http://www.youngmoney.com/careers/monstertrak/career_fields/037&quot;&gt;change majors&lt;/a&gt;) or pass the course having learned nothing.&lt;br /&gt;&lt;br /&gt;Now imagine them trying to learn about pointers from a Prima Donna. Apart from becoming utterly frustrated and dejected, they&#39;ll learn a lot of garbage. The things that will stick the most are the parts that the Prima Donna invented or misinterpreted, because those are the most vivid explanations the Prima Donna will offer. That&#39;s the core of what makes him an &quot;eminence&quot; in the &quot;computing world&quot;.&lt;br /&gt;&lt;br /&gt;And &lt;span style=&quot;font-weight: bold;&quot;&gt;now&lt;/span&gt; imagine yourself working with them on a project. 
Imagine working on a piece of code that has to traverse a tree with a co-worker who believes that &quot;recursion is for programmers who have no style.&quot; Imagine hunting down an interoperability bug caused by bad handling of character encoding in code written by a person who has been taught that &quot;the ASCII table is physically implemented in the microprocessor.&quot; Imagine writing business logic that has to use a database designed by someone convinced that a transactional database must satisfy the fifth normal form.&lt;br /&gt;&lt;br /&gt;Not a cheerful thought, is it?</description><link>http://beardseye.blogspot.com/2007/01/uneducational-crisis-1-human-factor.html</link><author>noreply@blogger.com (Anonymous)</author><thr:total>0</thr:total></item></channel></rss>