t3knomanser's Fustian Deposits

More drek than you can pull from an elephant's arse.

How Random Babbling Becomes Corporate Policy


Mad science gone horribly, horribly wrong (or right).


May 4th, 2009

Antivaccination Deaths

An infant died of whooping cough in Australia recently. She was too young to be immunized, but if the adults she came in contact with had been, she would have benefited from herd immunity, and would still be alive.

Out of curiosity, I ran some numbers on vaccinations.

Let's be very generous. Let's posit that this utterly unsubstantiated and implausible link between autism and vaccination exists.

What's the rate of incidence? There are 6 in 1,000 people who have some sort of autism spectrum disorder. 2 in 1,000 have true autism. We'll work with that number. Let's assume that every member of this 0.2% is properly diagnosed and that the disorder was caused by a vaccination.

I repeat: these assumptions are very generous. Even if vaccines did cause autism (they don't), we know that there are other causes as well. These are very generous assumptions to make.

Now, let's look at measles. Among unvaccinated people in a developed country, 3 in 1,000 die- a 0.3% fatality rate. In underdeveloped nations, it's closer to 280 per thousand. In immunocompromised patients, like AIDS victims or cancer patients, it's in the same neighborhood- about 300 in 1,000.

And that's just measles. And that's just deaths- we're not counting complications like corneal scarring- yes, measles can blind you.

So, even if we grant the most generous possible claims made by the anti-vaxxers, their arguments don't stand up. Measles, alone, is a more credible threat than vaccine induced autism, even if every autism case was caused by a vaccine. Even if we take the absurd claims at face value, the argument doesn't hold up.

Oh, let's keep going. Whooping cough kills 600,000 people a year of the 10-90 million it infects. Why the big range? It occurs mostly in third world countries where it's hard to get good statistics. Let's pick in the middle- say, 60 million cases. That's a 1% fatality rate. Heck, even if we go out to 90 million, we're still looking at a 0.6% fatality rate- which is the same rate of autism spectrum disorder in the population.

Between measles and whooping cough, we're talking a 0.9% fatality rate. Wanna start adding diseases? Polio isn't extinct, you know.
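The back-of-the-envelope arithmetic above can be checked in a few lines. This is just a sketch of the post's own numbers, not new data:

```python
# Rates from this post, expressed as fractions.
asd_rate = 6 / 1000          # autism spectrum disorder, all causes: 0.6%
autism_rate = 2 / 1000       # "true" autism, generously blamed on vaccines: 0.2%

measles_fatality = 3 / 1000  # unvaccinated, developed country: 0.3%

# Whooping cough: ~600,000 deaths a year out of 10-90 million cases.
deaths = 600_000
midpoint_rate = deaths / 60_000_000   # middle estimate: 1.0%
generous_rate = deaths / 90_000_000   # most generous estimate: ~0.67%

# Even stacking the deck: measles plus the most generous whooping cough
# rate is still several times the worst-case "vaccine-caused autism" rate.
combined = measles_fatality + generous_rate
print(f"combined fatality risk: {combined:.2%} vs autism rate: {autism_rate:.2%}")
```

The combined figure rounds to roughly 0.9%, which is where the number in the paragraph above comes from.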

Ignoring the non-existent autism link, there are real risks to vaccines. The CDC has some data, but it should be perfectly clear: vaccines are less dangerous than the diseases they prevent.

April 25th, 2009

I love functional programming. I'm going to present on doing FP in VB.Net in August- this is a feature new to .NET 3.5. I was very excited about the ability to do functional programming in VB.Net. And some of the key features I really wanted, currying and closures, are there.

But the limitations... they almost make it not worth the trouble. Most obviously, VB.Net doesn't support true lambdas. In a true lambda environment, I could do something like this:
f = Function(x As Integer) If x Mod 2 = 0 Then Return x / 2 Else ... some other code ...
In true lambdas, you can put any code you like inside of your function. C# and F# allow this. VB.Net does not- VB.Net only allows expressions (you can't do ifs or loops or anything like that).

Still, there's a lot you can do with expressions, so that's not too bad. Since you get closures, you can work around that without too much trouble.

But then you start running into the bizarre things. I wanted to do a patterned call. In many functional languages, you can create functions like this: f(1) = 1; f(2) = 2; f(x) = f(x - 1) + f(x - 2);. Calling f(13) will return the 13th number in the Fibonacci sequence.

Now, I accept that VB.Net wouldn't have an architecture like that built in- it's a somewhat obscure functional trick. But I was hoping I could roll my own. My first attempt at it was to come up with a compiler macro- oops! VB.Net doesn't support pre-processor macros. C# does, of course.

Well, okay, what about Attributes? .NET has the ability to define metadata on code, that you can "reflect" on to change runtime behavior. I could do something like this:
Module Module1
  ' "fib" is the function name; the second parameter is the pattern- if it returns true, execute this
  ' operation, otherwise go find another operation named "fib" to execute.
  <PatternedFunction("fib", Function(x As Integer) x = 1 Or x = 2)>
  Function f1(x As Integer) As Integer
    Return x
  End Function

  <PatternedFunction("fib", Function(x As Integer) True)>
  Function f2(x As Integer) As Integer
    Return PatternedCall("fib")(x - 1) + PatternedCall("fib")(x - 2)
  End Function
End Module


Not as concise as I would like, but hey, it works, right? Wrong. Turns out, since Attributes are evaluated at compile time, you can only pass constant expressions into them. Since a function may possibly contain a closure (even though this one does not), you can't ever treat a function like a constant.
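For what it's worth, the same dispatch idea works fine as a plain runtime registry- no attributes or macros needed. Here's a sketch in Python (since the VB.Net version is the whole problem); patterned_function and patterned_call are invented names mirroring the attribute example above:

```python
# A hypothetical runtime registry for "patterned" functions: each name maps
# to a list of (predicate, implementation) pairs, tried in registration order.
registry = {}

def patterned_function(name, pattern):
    """Register the decorated function under `name`, guarded by `pattern`."""
    def decorate(fn):
        registry.setdefault(name, []).append((pattern, fn))
        return fn
    return decorate

def patterned_call(name):
    """Return a callable that dispatches to the first matching pattern."""
    def dispatch(x):
        for pattern, fn in registry[name]:
            if pattern(x):
                return fn(x)
        raise ValueError(f"no pattern matches {name}({x})")
    return dispatch

@patterned_function("fib", lambda x: x == 1 or x == 2)
def fib_base(x):
    return x

@patterned_function("fib", lambda x: True)
def fib_recursive(x):
    return patterned_call("fib")(x - 1) + patterned_call("fib")(x - 2)
```

With the base cases f(1) = 1 and f(2) = 2, calling patterned_call("fib")(13) walks the registry and evaluates to 377.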

I'm being an FP snob, and I know it. My reason for wanting to do this is less because I have a specific need and more because I want to. I get frustrated when a language implements a potentially awesome feature in a half-assed way, but as I think about it, "Potentially awesome, practically useless" describes VB.Net to a "T". If I had my druthers, we'd be a C# shop if we were doing Microsoft at all- though I think a big portion of our business would be streamlined by a real RAD language, like Ruby or Python.

All of my complaints would be patched if VB.Net supported compiler macros. I'm stunned that it doesn't- it's not exactly the hardest thing on earth to implement; your average C compiler has had a macro pre-processor since before I was born. C# has one. And here's the real kick in the teeth: most macro engines are language agnostic, so there's no reason they couldn't have wired the C# engine onto VB.Net.

Basically, this is an exercise in driving home the flaws of the language that pays my bills.

April 21st, 2009

Gravitone



Here's a demo of my current project- an iPhone app called "Gravitone".

April 12th, 2009

Ever since moving to Pittsburgh, I've kinda been avoiding the Warhol Gallery. I don't like Warhol. I don't generally care for pop art (although this is pretty cool).

But, a friend volunteers/works there and managed to score tickets to a beer tasting. Hey, free beer? I'll take a look at any art gallery if free beer is involved.

The local brewery they chose, Stoudt's, didn't have much to offer that I was blown away by. Good, but not spectacular stuff. Very Belgian style, even in their "American" beers. But that's not the real point of this post.

After enjoying a generous helping of beer, we took a stroll through the gallery. A lot of it, unsurprisingly, was dedicated to Warhol himself. There were some other modern installations too. And it was all, generally, crap. As I'm walking through there, seeing wall after wall of celebrity snapshots and pop culture ephemera, "recontextualized" as art, it really struck me:

If pop art were a person, it would be the gum-snapping coworker in the cube over from you with some ear-wormy ringtone on the too-loud-cellphone that natters on and on and on about Branjelina and Brestica or whatever while gossiping about how great the next "Sex and the City" movie is going to be.

That's my take away from the Warhol gallery. A large amount of pop art focuses on taking the inane, vapid and annoying and "recontextualizing" it as art. The result is art that is inane, vapid and annoying.

Now, after venting my spleen, let me cushion the blow with a hint of perspective. A large part of my reaction is that it simply hasn't aged well. The reason people appreciate Warhol is that he altered the definition of what constitutes art. That's not entirely a good thing, and he was certainly standing on the shoulders of giants, like Duchamp. But teleporting myself back to the 50s and 60s, seeing giant Elvises shooting at me from a wall would be jarring, and that's obviously the desired effect.

One solace in the Warhol gallery: Warhol hated Pittsburgh. Reviled it. So, the fact that the town has a little shrine to him and has named a bridge after him stands out as a little "fuck you" to Warhol, and I can live with that.

Not content to have exceeded the assignment's requirements by adding a graphical layer, I went a few steps farther: I added support for animation and multitouch gestures. The latest version of the source lives here, and it's chock full of comments this time.

Once again, it's under a Share-Alike-CC-license. Don't hand it in as your homework.

April 11th, 2009

iPhone Programming : CS193P

Stanford University is publishing video and assignments for their iPhone programming course online. I've been following it and doing the assignments, and man... I miss compsci classes. I've been having so much fun doing this. For those that recall me in college, I was a lazy underachiever in most of my classes- but not the programming ones. In those, I always took the assignment and exceeded the parameters. I would show other students how to do the assignments. I'd add little flourishes.

This has been such a breath of fresh air. By day, I slog through tedious code written in tedious languages to do tedious tasks. By comparison, programming on the iPhone is downright sexy. It's fun, it's fast.

But more than that, I'm enjoying my remote college experience. My brain is getting a gentle stretching, and I really like that. Of course, it's very gentle- the course material goes at a painfully slow pace and is treading over the basics of OOP with a leaden step. But then I pick up the homework assignments, and run past the requirements and show off, and I don't care how dull the lectures get.

Not to say I get nothing from the lectures. I finally "get" Objective-C memory management. ObjC has an approach that's someplace between Java-style Garbage Collection and C style malloc/free.
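As a toy illustration of that middle ground (manual reference counting), here's a sketch in Python; the RefCounted class is my own invention, though the retain/release method names mirror Objective-C's:

```python
class RefCounted:
    """Toy model of Objective-C style manual reference counting."""

    def __init__(self):
        self.retain_count = 1    # alloc/init hands you one owned reference
        self.deallocated = False

    def retain(self):
        self.retain_count += 1   # claim shared ownership
        return self

    def release(self):
        self.retain_count -= 1   # give up ownership
        if self.retain_count == 0:
            self.dealloc()       # last owner gone: freed immediately

    def dealloc(self):
        self.deallocated = True

obj = RefCounted()  # count: 1
obj.retain()        # count: 2 (a second owner)
obj.release()       # count: 1, still alive
obj.release()       # count: 0, dealloc runs deterministically
```

Unlike a garbage collector, nothing scans the heap; but unlike malloc/free, no single owner has to know it's the last one. The object dies the instant the count hits zero.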

In any case, I've got my first non-trivial iPhone application done. The business logic is pretty trivial- do some stuff with polygon shapes- but the UI has drawing and animations, which is well beyond the goals for the current homework assignment. If you have an Intel Mac, you can download the SDK from Apple (free signup required) and run it if you like. The linked code is distributed under a CC-share-alike license: HelloPoly code.

If anyone is dumbshit enough to try and hand this in to the class, they're going to get owned, because it's pretty obviously not what the assignment called for.

March 28th, 2009

Penis Train

I get back from the opera, after a wonderful night out with my wife, and the Internet gives me this:


Well, it's no La Bohème, but it's got a certain charm.

The Pittsburgh Opera Company did an excellent job of putting on La Bohème. It's not one of my favorite operas, but they did a great job. We have to do this and the symphony more often.

March 25th, 2009

Against the Cloud


Screw you, Cloud


I hate the term cloud computing. I'm generally opposed to the idea of it. I know that this goes against the current fad, but I really, really have a philosophical problem with it. Before I go into a long-winded history lesson to make a point, I want to address one thing first: the utter stupidity of the name "cloud computing".

When IT people make diagrams, there are often areas or elements that are out of scope. For example, when talking about how data gets from your computer to Livejournal and back, it's superfluous to explain all the routers and network hops in-between. For most applications, I don't need that kind of detail, so I just draw a cloud there and label it "the Internet". A cloud means the details are unknown or unimportant to the problem at hand.

"Cloud computing" takes its name from that. "Where does the application live? The cloud!" A market-friendly way of saying, "Don't know, don't care," which conceals the fact that we do know and should care.

In the beginning...


Let's rewind the clock. A long, long time ago, computers were expensive. Processing power was expensive. Storage was expensive. Because of this, computing resources were jealously guarded. Slowly but surely computers made the transition from batch processing to interactive sessions, and it's at that point something interesting happened. We finally reached the point where we could have programs that interacted with the user. In a time sharing system, we could have a bunch of programs running and a bunch of users talking to them, and it all happened fast enough to seem instantaneous to the end users.

This was the birth of the dumb terminal era.
I have a vision of a device. The super-smart anti-terminal. It's small, maybe the size of a very large wall wart. Inside, it's got a lightweight processor, a decent sized laptop HDD or SSD, wireless Ethernet, maybe Bluetooth/wireless USB. It plugs directly into an outlet. It has a USB port or two on the outside.

Out of the box, it's got an embedded Linux server running. You plug it in, connect to its "configuration network" and follow through a wizard like you would when setting up your wireless router. It joins your wireless or wired network, it registers a domain name (or a free dynamic DNS subdomain), and points it at your network. As much as possible, it automatically sets itself up with port forwarding on your router.

The entire point of all of this? All those cloud services? Google Docs, Twitter, etc? They run here. Out of the box, it's got WordPress, Laconica, a web based office suite, and acts as a remote file server. It can stream media, it can track bittorrents. Using some of the open social networking standards, it becomes your social networking identity. Accessible via the web, via phone client software. Options to mirror an encrypted disk image to an offsite backup.

Now we get the benefits of thin client access, anywhere access, etc.- but without reverting to the mainframe age with slightly different branding.

All the pieces are there to make this technology work. What really needs to happen is to have someone sit down and really work out the mechanics of doing this so that it really is plug-and-play. It needs to be as easy to configure as a Wii. Not an easy task, especially considering the complexity of the problem being solved.

In Conclusion


I think "cloud computing" is a horrible term. It's centralized computing. And I think it's a bad thing. I think putting things on the edge, where the users live, is a much better idea than putting it on Google or Microsoft's servers.

And have you noticed that? Not to go all tinfoil hatty here, but have you noticed that it's giant companies that want to hold your data for you? They want everyone using thin clients on tiny little netbooks, and that gives them the power. They have the data, they have the processors and the storage. I'm not claiming that there's any conspiracy to weaken the public- but conspiracy or no, that would be the result.

March 15th, 2009

Amazing Stories, MAR1943


Amazing Stories, MAR1943
Originally uploaded by t3knomanser.
Purchased at the Caliban Bookstore on Craig St. in Oakland. For starters, that is an AWESOME bookstore. And this, for $9.50 is an awesome book.

One of the stories advertised on the cover, "Victory from the Void" has a most promising capsule summary in the table of contents: "An asteroid arrives in the Solar System and circles the Earth - and becomes a base for Nazi bombers!"

Caliban had a whole pile of these, but this had the best cover.

March 1st, 2009

Teller

A fun article about Teller (of Penn & Teller), and his Red Ball illusion. I must tell you, seeing it actually done is more amazing than the article really gives credit for. It's one of the most perfectly executed effects I've ever seen, and it's done so simply, with an utter lack of pretension. It is a perfect trick.

February 27th, 2009

Musings

So, every once in awhile, with Heavy Metal pounding in my ears, my fingers on the keyboard typing "JETPACK SHARK" for the millionth time, building the ridiculous (and now finished) first draft of the short (it's 19 pages, but my target is 15) intro issue for Jetpack Shark, I stop and think: holy crap, was that a tortured sentence or what?

No, no, not that. I stop and think, "Hey, this is kinda ridiculous and hyperviolent. Is there a market for this, really?"

Thankfully, the latest Tales from the Longbox got posted today. For those unfamiliar, author Protoclown takes comics (bad comics), and rips the shit out of them with scathing commentary. It's like MST3K for comics.

And you know what? When I compare my ridiculous, silly comic to Marvel's rapidly decaying "Ultimates", I notice a few glaring differences. For starters, events make sense and happen for a reason. A causes B, which causes C. Even when the events are structured for sheer awesomeness and play fast and loose with reality or physics, there's an internal logic that carries things forward. Sure, all the characters are paper-thin cartoons, but... well... it actually makes more sense. Again, it's got its own internal logic.

Also, nobody ever says: "I thinketh it cuteth." Seriously.

Reading that and knowing that these people are getting a comic published simultaneously makes me feel good about my writing skills and weep for humanity.

February 26th, 2009

You know me. I get ideas. From Recursed, the time travel game, to all sorts of other things. Writing, reading, programming, whatever.

Well, I've had another idea. It's turned into a mostly finished first draft of a script for a comic book.

Jetpack Shark: The Comic.

The title character, of course, is a shark with a jetpack. The setting is a post apocalyptic wasteland, with one final outpost of humanity: Oceanic City, a paleofuturist floating, domed city. Jetpack Shark has taken it upon herself to defend the city from all sorts of enemies: dinosaurs, robots, aliens, alien robots, robot dinosaurs, alien dinosaurs and alien robot dinosaurs. It's not an altruism thing, she's no hero. She's a shark with a jetpack. In exchange for her saving the city, she eats a few school children now and again. They're young and delicious.

The flavor I've been going for in the script is the narrative of a young kid playing with toys on the kitchen floor. These are the sorts of stories I made up as a kid. I mean, talking dinosaurs in a giant speedboat just make sense in that context. It's crazy and wacky and emphasizes fun.

I need an artist.

February 21st, 2009

For Minna's birthday, I made her a dark chocolate cheesecake using this recipe. This is a bit unfair, really- this recipe should carry a surgeon general's warning. It's incredible, and probably about as dense as electron degenerate matter. And chocolate.

One warning: while not the hardest recipe on Earth, it's definitely the sort that requires some baking experience and some patience. And big bowls. I don't have any big bowls. And something with a bit more oomph than a hand mixer (I was deathly afraid I was going to burn the motor out).

February 10th, 2009

Recursed: Open Playtest

A bit ago I mentioned the core ideas for Recursed. Well, after registering the most awesome domain name ever, I set up a Wiki, and went to town putting together the core rulesets and some printable playtesting supplies.

The wiki has all of the core stuff, even if most of the pages are protected against edits. The Talk pages aren't, and that's really where I expect the action to be.

So... have at!

February 5th, 2009

Mike Nelson, of MST3K and RiffTrax fame, is taking it upon himself to eat nothing but bacon for a month. Rifftrax does live webcasts every couple of weeks, and I'm curious to see the horrors wrought on him by the second week of this experiment- if he keeps up with it that long.

You folks know me. I love bacon. But even I could not eat nothing but bacon for a month. Bacon with a side of raw broccoli, Mountain Dew and the occasional slice of pizza... well, not since college.

February 2nd, 2009

For Christmas, Minna got me the boxed set of the "Man with No Name" trilogy: the famous Clint Eastwood spaghetti westerns filmed by Sergio Leone. I'm a sucker for these kinds of movies, and we've been enjoying the brooding anti-hero.

I was discussing this recent Clint Eastwood kick with a co-worker, and she said, "I never got into Eastwood. He's a bit of a one trick pony."

"Yeah," I replied, "but it's a good trick."

Leone had a similar comment. He was reputed to say that he loved working with Eastwood, and that he had two expressions: one with a hat on, and one with it off.

I didn't develop an interest in Westerns until college. One of my professors assigned High Noon as extra credit. He absolutely raved about the film, and I respected his opinion, even if a black and white Western from the 50s didn't seem like my cup of tea. And one can always use extra credit.

I was pleasantly surprised to find my preconceptions about the genre subverted. High Noon was a film that didn't trot out the idea that a manly enough hero could thwart any obstacle. It had emotional depth. A 1950s movie where the male lead cries is somewhat landmark. Especially a Western.

In my recent research, I learned that High Noon marks one of the early "revisionist Westerns". An offshoot of the genre that broke a lot of the traditional conventions. It's a good way to identify the sorts of Westerns I like versus the ones I wouldn't care for. I find it interesting that John Wayne only ended up doing revisionist Westerns when he didn't know he was doing it- films that were subtle sendups of the traditional John Wayne character.

I'm drifting off topic; I didn't mean to turn this into an analysis of the Western genre. In fact, I have a specific and interesting anecdote to relate.

In 1929, Dashiell Hammett, in a novel called Red Harvest, introduced the character of the "Continental Op" to Personville (often called "Poisonville" by the characters). The Continental Operative is a nameless, hardboiled investigator working for (and occasionally against) the Continental Detective Agency. Personville is a lawless town ruled by warring crime gangs. The Continental Op plays both sides against the middle, and ends up bringing peace to the town (by getting the gangs to kill each other), and profiting a bit in the process. It's one of the quintessential noir novels, and it's quite a good read (the better of the two "Continental Op" novels; I can't speak to the short stories).

In 1961, Akira Kurosawa directed Yojimbo. A nameless Ronin comes into a town ruled by warring criminal gangs. He convinces each gang to use him as protection against the others. He brings peace to the town by getting the two gangs to kill each other. Sound familiar?

Not a novel story, perhaps. But Kurosawa freely admitted the influence of Red Harvest on his work.

Sergio Leone was a big fan of Kurosawa's work; it influenced a great deal of his directing style. When he made A Fistful of Dollars (a story about a nameless gunslinger who cleans up a Western town ruled by competing crime families by getting the two gangs to kill each other off), he leaned heavily on Kurosawa for the look and style of the film.

The film was intended as a tribute to Yojimbo, but Leone found himself in a copyright dispute. Kurosawa's production company claimed that A Fistful of Dollars was an unlicensed remake of Yojimbo. The courts agreed, and a chunk of the gross went to Kurosawa's company, along with exclusive distribution rights in the Asian markets.

I would argue that Leone got screwed on that one. It was hardly a unique story idea, and it's at least as distant from Yojimbo as Yojimbo is from Red Harvest. I guess that's why I'm a programmer and not a judge. But the interesting turn comes next.

Leone had a falling out with his production company, and when he went to make the sequel to Fistful, named For a Few Dollars More, he got someone else to produce. His old production company sued, claiming they owned the rights to the characters from Fistful, including the character Eastwood had played (called "Joe" in the script, although only the gravedigger ever used that name).

This time, the court (presumably a different court) found that the archetype of the Western gunslinger wasn't unique enough to copyright. Never mind the fact that Eastwood's character in For a Few Dollars More was not only the same character (although called "Monco" a few times in this film), he wore the same poncho. Not a similar poncho- the same poncho. For all three films (the third being The Good, the Bad, and the Ugly) Eastwood wore the same poncho (and it was never washed during that time). Then, of course, there was the similarity between the two titles, the style of the film, and everything else.

This time, Leone's old production company got screwed. In sports, we would call something like that a "make up call"- a decision so obviously flawed that it could only be taken as an attempt to balance the previous bad decision. I sincerely doubt that this was the case, but I find it fascinating that Leone lost one case on pretty dubious grounds and won another on similarly dubious grounds. Maybe his new production company understood how to hire good lawyers. I dunno.

One last thing: in all of Leone's movies, he used Ennio Morricone for the music. The soundtracks on these films are incredible. I mean, seriously fantastic music. "The Ecstasy of Gold" is pretty incredible, but "Sixty Seconds to What?" gives me chills when the organ cuts in.

Superbowl Celebrations

As you may have heard, the Steelers won the Superbowl. After the game, Pittsburgh went kinda nuts. Oakland, the nearby college neighborhood, got a little too crazy- there was some vandalism and people burning furniture in the streets. Shadyside, on the other hand, was mostly good-natured drunks blocking traffic. Southside was a little wild, but not nearly so much as Oakland.

Oakland, of course, is aswarm with out-of-towner college kids that were just looking for an excuse to wreck shit. Southside is a heavy college-kid neighborhood, but it seems to be more locals. Shadyside also has a heavy college bias, but it's mostly the college kids with money.

I got some photos, and below are a few of my favorites.

January 31st, 2009

I got a bug up my ass the other day, and started working on making a set of rules for a miniatures strategy game. Being the sort of person I am, this is no normal combat game- it involves time travel.

Not the whimsical variant of time travel you'd see in a game like Chrononauts, where you're looking down on all of history and playing with it. No, this is a game about time traveling soldiers on a battlefield, where events from the future ripple back into the past and vice versa. Essentially, I wanted to do something more like Chronoton (awesome game, if you haven't played it, btw).

As you can imagine, it's a tricky game to design.

As odd as it may sound, I had an easier time figuring out a workable time travel system than I did working out a combat system. I've got enough of the rules sketched out that I can unit test them, but this raises another problem- with all these Doppelgängers running around, it's going to be a pain in the ass to track all of them. Which brings me to my real question: how would you rather track these sorts of things?

The options I've come up with so far are these:
Logsheet
This is my least favorite option. I don't want to make the players track lots of things with paperwork. It also means that you can't tell the state of the game without asking the other players, "Is this piece the Doppel of this one or that one? How many wounds does it have?"
Token madness
Lots of things have to be represented by tokens. Declarations to travel, commitments to future actions- these have to be easily seen and obvious to all players. But what about using tokens to track doppling? The advantage here is that one can have a bunch of generic "Soldier" units, for example, and identify which piece is Pvt. JenkinsP and which is Pvt. JenkinsF (and, since we could have all sorts of future actions, F1, F2, F3...). When JenkinsP departs for the past, JenkinsF becomes JenkinsP. This means establishing parent-child relationships via tokens, and that means the tokens have to be unique enough that every unit on the field could Doppel multiple times and you could still tell them all apart.
Pre-printed Planning
There's an absolute number of Doppels that could appear for any given unit. For some units, like a Time Master, it could be quite a lot. For your average soldier, it will only be 3 or 4. So perhaps I could print off a bunch of pieces all labeled "Pvt. Jenkins". The programmer in me balks at this approach- we're essentially hard coding in the numbers. I could drop a counter marked "0" on the first Pvt. Jenkins, and when she Doppels, I can put another Jenkins on the board with a "1" counter. When the present day Jenkins travels back, I can move the "0" over to the "1", the "1" to the "2", etc.

What other options do you think I could use? I'm partial to the last one in the list- it seems to be the easiest to track, and the most visual. At a glance, you can see who's who, and where they originated in the timeline.

January 30th, 2009



"Let's stop pretending these Guantanamo guys are all supervillains. They're thugs and jackasses, not Magneto. If they had mutant powers, we would have known by now."

January 28th, 2009



My shipment of the greatest soda in the world isn't coming like I expected. I'm sure that Amazon and UPS will deliver a new batch soon, but woe to this lost shipment.

Somehow, my heart will go ahead on, but I was looking forward to having a few of these for the Superbowl.

January 26th, 2009

Games

Guitar Hero, the text adventure. Enjoy.

January 25th, 2009

Gran Torino

As miusheri said, we went out to see Gran Torino tonight. This afternoon, actually. I was stunned to find that the 1:45 showing was still counting as a matinee- I thought that, nowadays, matinees ceased with the first showing. I'm not complaining- $5 tickets on a lazy Sunday afternoon are nothing to complain about.

The film was excellent. You'll hear a lot of complaints about the acting, and they're not unwarranted. A lot of the Hmong actors were pretty bad*. That's okay though, because Clint Eastwood was ready to carry the whole damn movie on his own. As Minna said, it wasn't a film with a lot of surprises to it. You knew where it was going to go the whole time; each beat was planned out and scheduled according to the pattern of this sort of film.

And that's fine. It's okay to use a formula and cliches, especially if you do them well. For all that there were plenty of weak points in the script and the acting, it was well executed and enjoyable. The film hung together not on the strength of the plot or the drama, but the comedic moments.

And of course, the septuagenarian Eastwood as Walt Kowalski brandishing an M1 and growling, "Get off my lawn!" is reason enough to turn out for the movie. And for manycolored, there's this exchange:

(while discussing why the Hmong were relocated to the US)
Sue: Blame the Lutherans.
Walt: Everybody blames the Lutherans.


And to round out the beginning and the end of Eastwood's career, Minna and I watched A Fistful of Dollars last night. I had only seen bits and pieces, she had never seen any of it. An obvious classic, and it's worth noting that the same complaints people have about Gran Torino apply to that film as well. It's also obvious that Eastwood learned quite a bit about using the camera from Leone, although he doesn't have the same flair for the melodramatic.

*Most of the Hmong actors were locals that had never been in anything before. These people playing relocated Hmong trying to navigate American culture were actually Hmong trying to navigate American culture. There's an authenticity that can't be gotten by shipping 50 Asian actors from LA to Detroit.

January 20th, 2009

Government

run the fuck away

Government
Originally uploaded by t3knomanser.

January 18th, 2009

Number Six?

Megaweapon
The Steelers are going to the Superbowl. The Ravens got a much-deserved beatdown, and in two weeks our boys will be duking it out in Tampa against the Arizona Cardinals. With any luck, this will be World Championship #6!

January 16th, 2009

The puzzle I posted about yesterday is fixed.

The immediate cause, now resolved, turned out to be some bad data in the tables that generate the dynamic SQL. A few rows with NULL values were lurking, and when those NULLs got appended to the other strings that make up the SQL code, the whole statement came out NULL. When SQL Server appends a NULL to any other string, the result is always NULL.

Now, I knew this, and I also designed my code with the expectation that there might be some queries with nothing but dates in the WHERE clause. So I threw in a COALESCE() call that says, "if the WHERE clause is NULL, replace it with '1=1'". It's just a placeholder, because there's also some logic that appends "AND period_year=2008...". To keep the "AND" syntactically correct, I just wanted some condition that would always evaluate to true.

So, the intended behavior was that, if there was no WHERE clause, it would generate the following: "WHERE 1=1 AND period_year = 2008...". There was only one problem: I appended the "AND period_year" bit before I coalesced to 1=1.

Follow with me:
@whereClause is NULL.
I append "AND period_year" to @whereClause, and the result is NULL.
I then COALESCE to convert the NULL to "1=1".
I then send a query over to the remote system with the following WHERE clause: "WHERE 1=1".
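In T-SQL terms, the broken ordering looked roughly like this (a sketch with illustrative variable names, not the actual code):

```sql
DECLARE @whereClause nvarchar(max);  -- NULL: no conditions were found

-- Broken ordering: appending to NULL yields NULL...
SET @whereClause = @whereClause + N' AND period_year = 2008';
-- ...so the COALESCE fires and silently discards the year filter.
SET @whereClause = COALESCE(@whereClause, N'1=1');
-- Result: 'WHERE 1=1'

-- Fixed ordering: coalesce first, then append.
SET @whereClause = COALESCE(@whereClause, N'1=1')
                 + N' AND period_year = 2008';
-- Result: 'WHERE 1=1 AND period_year = 2008'
```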

And the query never completes- presumably because, stripped of the period_year filter, the remote query has to chew through vastly more data than intended. That was the immediate problem, and it's fixed now, but it raises a few other questions:
1) Why did the query complete when run on its own?
2) When this query was #11 in the sequence, why did the output hang at query #6?

My best guess answer to question #2 is that SQL Server's output was lying. It just didn't flush the output buffer and so it told me I was on query #6, but really it was busy hanging on query #11. Which is interesting behavior, and only reinforces how damn difficult it is to debug TSQL.

The moral of the story, of course, is that dynamic SQL is bad. I hate it, and I hate having to do it. This whole application is one of my least favorite projects, and sadly, there's really no way to avoid getting stuck with dynamic SQL given the objectives, unless I wanted to write it in .NET. Oh, that's right, I did want to write it in .NET or SSIS, but the user demanded that each query be implemented via a stored procedure.

Since SQL Server doesn't let you specify the linked server name via a variable, there is absolutely no way to do this without resorting to dynamic SQL. It's awful. Awful. I know people find Oracle frustrating to administer, but when it comes to programming in PL/SQL, I know that I'm not going to bump into arbitrary seeming restrictions, like "functions can't modify temp tables, even if the temp table is a locally created table variable, because we just don't like letting functions modify data, even if the only way to get data back from dynamic sql in Sql Server is to have it insert its results into a temporary table".
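Concretely, the restriction is that the four-part name in a query must be a literal- you can't write `FROM @server.db.schema.table`- so the server name has to be spliced into the statement text. A minimal sketch of the workaround (the server and table names here are hypothetical):

```sql
DECLARE @server sysname = N'REMOTE_FIN';  -- hypothetical linked server name
DECLARE @sql nvarchar(max);

-- The linked server name can't be parameterized, so it gets
-- concatenated into the statement and run as dynamic SQL.
SET @sql = N'SELECT SUM(amount) FROM '
         + QUOTENAME(@server)
         + N'.FinDB.dbo.gl_balances WHERE period_year = 2008';

-- The only way to get a result back out of dynamic SQL here
-- is to insert it into a temp table.
CREATE TABLE #result (total money);
INSERT INTO #result EXEC sp_executesql @sql;
```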

Ugh. Lesson learned: next time a customer specifies a technical preference for implementation details, I politely tell them to go fuck themselves and do it my way.

January 15th, 2009

I have a puzzle at work. My puzzle works like so.

I have an application that lives in a SQL Server 2005 database. It builds dynamic SQL based on snippets of code stored in DB tables, and then executes that dynsql against two other databases to get a single number back for each query I run.

So, the general flow is that for location X, I look up the bits of the first query, execute it against a remote database, and get a number back. Then I move on to the next query. Once I have executed all the remote queries, I update a DB table based on the results.

I have one entire class of locations that all use the same basic query with some variations in the where clause. The account numbers differ, things like that. Nothing big between any of the locations. And all of these locations work- except one.

For one location, partway through the chain of queries it executes, it hangs. If I pick it apart and run one query at a time, each query completes in a timely fashion. No one query hangs. If I start adding queries back into the process, I find that at 10 (of 15) queries, it works fine. But when I execute eleven or more queries, it hangs. These queries all execute serially, so it's not a threading issue or something like that.

Now, here's the kicker: it doesn't seem to matter which 11 queries I execute. Any combination of 11 seems to blow up. And it's not the 11th query that hangs- it hangs on query number 6 pretty consistently. I have about 10 other locations that do the same exact thing, and they all work fine. It's just this ONE location that this happens to. The hang appears to happen when accessing our Oracle Financials system only, never the SQL Server Data Warehouse we get some of our data from. It looks like it's hanging on the remote system, not my local SQL Server.

Suffice it to say, I and everyone I've shared this with are utterly perplexed. If it were something logical, like the last query being the source of the hang, or other locations also being problematic, or one specific query that always failed, I'd be able to dig in and fix it. These inconsistent and irregular bugs are the absolute worst.

Your bonus mindbending physics: the first evidence that we may really be living in a 2D space.

January 13th, 2009

Project Assessment

Rainbow Me
I've got a load of writing projects that I just haven't had the energy or impetus to work on recently. Some fiction, some non-fiction. Hell, some of it's software. Despite having a job as a programmer, I haven't done very much programming at all recently.

In any case, I'm just dumping a bunch of half finished ideas here behind the cut. Comments, insights, or whatever would be appreciated.