My evil TFS plans (Originally Published 3/1/07)

I have decided that the next time I find myself in charge of a Team Foundation Server, or more precisely in charge of maintaining a TFS process template, I will introduce a Work Item status of "Nuh uh".  This status would be almost exactly like a "Re-test" status, but without implying that you have actually fixed anything.  Instead, the implication is that the system in question works perfectly fine the way it is, and that the QA tester is, in fact, horribly mistaken.

I think it will be very useful.  But I’m sure the testers will immediately come back with a request for a "Yuh huh" status to counter it, and the whole team may spiral into some very childish behavior.  It should be quite entertaining.

Posted in Uncategorized

Where VB beats C# (Originally Posted 2/1/07)


Some people are screaming “Stone the heretic” already, but as a developer who has spent a lot of time in both the VB and C# worlds, I have to say that VB wins out in some areas that appeal to me personally.  I’m not talking about the usual “I can do it in fewer clicks than you” crap, either.  I’m talking about fundamental elements of the language itself.  Before VB.Net and C# existed, I was a young VB6 developer.  It paid the bills, and for a number of reasons I just couldn’t stomach C++.  I tried, really I tried, but something always struck me as “wrong” with the whole thing.  It wasn’t so much the language syntax; I was fine with that.  It was more a matter of how much baggage the language had as a result of being more or less a tacked-on extension of C… a venerable (meaning “really friggin’ old”) language.

The thing is, I didn’t like any of the other “real” languages much either, because they were all carrying around the same baggage.  It was things like header files that I hated.  Header files were fine in their day, and served a purpose… in the early ’70s.  When you were scheduling time on a mainframe that was running with 4K of core memory, you didn’t have the luxury of loading a whole solution at once like we do now.  Inter-project dependencies would have been a definite non-starter.  You had to compile each piece separately, and provide a shorthand version of the results to the next compilation step.  Header files provided a way around the very real limitations of the systems of the day.  Semicolons also had a very real purpose.  Your behemoth mainframe couldn’t do the kind of parsing that modern compilers can.  You had to tell it “okay, I’m done, go ahead and process that bit.”  As for the curly braces vs. end statement thing?  Tomaytoes tomahtoes.  Who cares, really?

The point is that these languages were designed to make life easier for the compiler, not the developer.  That’s why we had For loops that look like this:

for (i = 1; i <= 10; i++)
{
}

Seriously dude… WTF is up with that?  To experienced programmers this is easy to read, but can you honestly tell me that without a background in the C world, and asked to design a new language from scratch, you would have written it like that?  I don’t think so.  It’s baggage, and it has carried forward because “That’s the way it’s always been done”, which incidentally is widely considered the most dangerous phrase in the business world.  When we sit down at a client, and they simply want to replicate their manual business processes on-screen, we try to talk them out of it.  We want to re-engineer their processes to make them more efficient, not merely automated.  Just shoveling paper forms onto the screen is a process we used to call “bureaucramation” at my first consulting gig.  So why is it that we’re so unwilling to part with the “old ways” when it comes to our own processes?

When the whole world was running on C, I often said “I’m waiting for D”.  In the meantime, I was a gainfully employed VB, and sometimes PowerBuilder (shudder), developer.  When C# came out, I looked at it and decided that it was enough like D-flat for my taste and made the jump into learning it.  At this point they’d radically changed VB so that the two languages were largely identical anyway.  I still did the majority of my work in VB because that’s just the way things were at my employer; it wasn’t so much a “that’s the way we’ve always done it” thing… it was more like a total lack of any compelling reason to change.  After all, anything you could do in one language, you could do in the other (yes, I know there’s an approximate 1% difference in actual capabilities… the edge cases are not my point).  Certain clients wanted things in C#, others in VB, and here’s where I really started to realize just how similar the two languages really are.  C# is simply like VB being spoken by Yoda.  Instead of saying “Dim index As Integer”, you say “int index;”.  I’m convinced, by the way, that this is one of the primary reasons why C is so popular worldwide.  The majority of human languages are not ordered like English.  Most people in the world don’t say “the red ball”, they say “the ball red”.

C# has kept the terseness of the C languages, but eliminated some of the things that kept me away all those years.  For instance, there are no more header files!  We’ve moved into the new millennium at last, and accepted that perhaps the computer could be doing some of that grunt work for us.  After all, we sit around all day writing applications to make other people’s jobs easier, why shouldn’t someone be doing the same for us?  But at the same time it kept some of the minor irritants that used to grate on my nerves… things like semicolons.  As I said before, they used to serve a distinct purpose, helping a less-than-brilliant compiler to find its way.  But leave one off in an editor like Visual Studio and what happens?  You get a squiggly underline, and a message saying “Dude, you forgot the semicolon”.  Well if the editor can tell me that I forgot the semicolon in real-time, then I don’t really need it anymore, do I?  It’s not serving any purpose other than preserving an arcane syntax.

The semicolon is baggage, it’s legacy, it’s just there because “that’s the way it’s always been done”.  If they had taken out the semicolon, the whole C developer world would have had a massive collective conniption.  It would be like someone had decided to use parentheses instead of curly braces.  But think about it objectively for a minute.  Are multiple-line statements the majority?  No, they’re not.  So why are we putting semicolons on 95% of the lines that don’t wrap, just to make the 5% that do happy?  We constantly try to code for the “majority case”, but once again we’ve taken a “do as I say, not as I do” attitude.  We vehemently try to steer our clients away from the same types of behaviors that we willingly accept for ourselves.  With C#, Microsoft really had an opportunity to make major changes, and didn’t make them.  Instead, they put those changes into VB, and left the C-derived C# largely the same.  And that makes perfect sense… just look at the name.  People didn’t want Microsoft radically changing the way VB worked either, and I think they succeeded.  That’s why VB still doesn’t do short-circuiting logic by default, instead introducing new keywords like “AndAlso” and “OrElse”.  By the way, I find these truly appalling.  Words simply cannot express the sick, ashamed feeling I developed for the VB language the moment they were introduced.
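For anyone who hasn’t hit it in practice, here’s a minimal sketch of what the short-circuiting difference actually means (the module and function names are mine, invented purely for illustration):

```vb
Module ShortCircuitDemo
    ' Classic VB "And" evaluates BOTH operands, so the commented-out
    ' version throws an InvalidOperationException when n is Nothing.
    ' "AndAlso" stops as soon as the left side is False, like C#'s &&.
    Function IsPositive(ByVal n As Integer?) As Boolean
        ' Return n.HasValue And n.Value > 0       ' blows up on Nothing
        Return n.HasValue AndAlso n.Value > 0     ' safe: right side skipped
    End Function

    Sub Main()
        Console.WriteLine(IsPositive(Nothing))    ' False, no exception
        Console.WriteLine(IsPositive(5))          ' True
    End Sub
End Module
```

The extra keywords exist because silently changing what “And” means would have broken every existing VB6 program that relied on both sides being evaluated.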

While I’m on the subject of keywords; I am often heard saying that the hardest part about building APIs is deciding what to call things.  Names and keywords are of primary importance to me because I feel that you should be able to explain what you’re doing to the client in plain terms.  Some people disagree and call things the most obscure names they can think of so as to maintain the carefully cultivated air of mysticism their clients view them with.  I’m talking about things like LLBLGenPro using the term “Predicate” instead of “Condition”.  I had a discussion with the development team at the Ohio Department of Education about how arcane some of the LLBL namings were.  Another developer seemed genuinely offended that I had a problem with what to him was a “perfectly ordinary computer term”.  This was, incidentally, the kind of developer who spends a lot of time primping his wizard robes before meeting with the client.  It may be a perfectly ordinary “computer term”, but why should it be a “computer term” at all, when it’s a fairly universal concept?  It doesn’t have to do with some secret thing that only computers have to be concerned with like memory allocation or object serialization.  It has to do with a universal concept of how to describe what you want.

I think that objects, properties, and methods should do what they say, and say what they do in plain language.  And here’s where we run into another couple of “baggage” terms: “virtual” and “abstract”.  Forget what you know about computer languages for a minute, and tell me what those words mean to you.  The American Heritage Dictionary defines the word virtual as “Existing or resulting in essence or effect though not in actual fact, form, or name.”  But virtual methods actually do exist, and actually do stuff if you leave them alone.  If you don’t override them, they perform like any other method.  So they don’t resemble the dictionary definition of “virtual” at all, do they?  VB had the luxury of having no baggage, or at least significantly less, so VB has the keyword “Overridable”.  It says what it does in plain English.  “Abstract” is defined as “Considered apart from concrete existence.”  This one hits pretty close to what it means… but it’s still not as clear as “MustOverride”.
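To put the two vocabularies side by side, here’s a minimal sketch (Shape and Circle are invented names, not from any real library):

```vb
' VB's keywords state the contract in plain English;
' the C# jargon equivalent is noted in each comment.
Public MustInherit Class Shape                        ' C#: abstract class Shape
    Public MustOverride Function Area() As Double     ' C#: abstract double Area();

    Public Overridable Function Describe() As String  ' C#: virtual string Describe()
        ' A "virtual" method really exists and really runs if left alone.
        Return "some shape"
    End Function
End Class

Public Class Circle
    Inherits Shape
    Private ReadOnly _radius As Double

    Public Sub New(ByVal radius As Double)
        _radius = radius
    End Sub

    Public Overrides Function Area() As Double        ' required by MustOverride
        Return Math.PI * _radius * _radius
    End Function
End Class
```

Notice that C# uses the same word, "abstract", for both the class and the method, while VB distinguishes "MustInherit" from "MustOverride".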

The only reason developers know what terms like “abstract” and “virtual” mean is that we had to LEARN what they mean.  And while “predicate” and “abstract” might technically say what they mean to someone properly armed with a dictionary, they’re doing it in very obscure ways.  Here is where VB puts one in the win column for me.  “Overridable” and “MustOverride” tell me exactly what they do.  Of course, the concept of what it means to override something overridable is still something you have to learn, but if you were to change the very concept of what I’m talking about then we’d be talking about something else, wouldn’t we?  It’s like my favorite answer to the eternal question “Why is the sky blue?”  Because if it were a different color, you’d be asking me a different question.

So where does VB beat C#?  It’s largely in areas where C# is constrained by its own history.  Places where the original terms or structures were thought up by designers without the same kind of regard for their product’s users that we insist upon from ourselves.  Places where the language designers indulged their desire to put on wizard robes and play Oz rather than simply showing the users the man behind the curtain.  People will talk about one language being more “productive” than the other.  Personally I don’t see it.  I see differences in the everyday, mundane tasks of writing lines of code.

I enjoy being a C# developer, and I’ve come to terms with the irritants that I would have done differently if it were up to me.  But it wasn’t up to me, was it?  I don’t necessarily want to be a language designer.  It’s like the old saying about art.  I may not know much about language design, but I know what I like.

Posted in Computers and Internet

CodeMash followup (Originally Posted 1/23/07)


Overall, an excellent conference.  I’m hoping the reputation carries over to next year, and it will be even larger.  Some of the sessions were rather basic if you are already familiar with the subject matter, but that was kind of the point of the conference.  You were supposed to go to things you’re not familiar with.  It’s an interesting concept.  I’m not sure I’m personally the right kind of person for that.  My brain tends to fill up, and things get flushed from the cache, so I might avoid taking a session about Ruby or AJAX while I’m working on a WinForms application.

It would be very cool if all the sessions could be filmed next year and posted, so that we can go back and watch the sessions that conflicted the first time around.  Of course, that means having someone in each session to do the filming.  I think it could be pulled off, though, as long as the speakers can restrict their movements to a pre-defined "stage" area, so we don’t need a live camera operator.  Set a mini-DV up on a tripod, aim it at the speaker, and change the tape between sessions.  I think it would work.  Perhaps that could be part of the responsibilities of the session Proctor.  Speaking of which, I’ll say it just this once more and then let it rest until next year…

"Proctor?! Damn near killed ‘er"

I could download them to my new Zune (Thanks, Sogeti) as long as Microsoft keeps up with the updates on this one.  The hardware seems solid, and performs very well.  My main source of concern on this product is Microsoft itself.  They’ve gone and created a device that doesn’t follow their own standards.  The Zune is not a "Plays for Sure" device, and won’t talk to Windows Media Player… or anything else for that matter except for the "Zune Player", which appears to be a custom, modified, skinned WMP.  Supposedly this is to support Microsoft’s new equally-custom DRM scheme in which every file you send to the device gets DRM injected into it whether you like it or not.  Then, when you beam the tracks to your friends, they self-destruct after three days.  I can see this for commercial tracks that I download through the Zune store, but get serious here folks.  Are you trying to tell me that I can’t beam a co-worker the latest DotNetRocks without it disappearing in three days???  That’s retarded.  I’m not trying to be offensive here with the use of "retarded".  I’m not just saying it’s stupid.  I’m saying there’s something severely wrong with the thought processes that spawned such a fundamentally flawed idea.  Still, to me it doesn’t matter much, since I can load up whatever I personally want from my own computer and have it last forever, and that’s how I intend to use the thing.

I suppose my only regret about the conference is that I never hit the water park.  The first night it was just too late once I got there.  The second night had an organized social gathering… and then it was too late.  The final day of course I had to get out of the hotel.  Oh well, I don’t want to go frightening the children anyway.  6’2" overweight pasty programmers can scar children for life.

Posted in Computers and Internet

Feature Priority Levels (Originally Posted 1/8/07)


For years I’ve heard UI niceties referred to as "Chrome".  "Chrome" implies that something isn’t essential to how the product functions, but is just there to improve the "look and feel".  Recently I worked with a guy who categorized features based on the components of a hamburger.  The joke there is that while everyone wants to work on the meat of a project, sooner or later someone’s gotta make the bun.  Putting that into software project terms, everyone wants to work on the business layer, and no-one wants to work on the boring, repetitive maintenance interfaces.

I don’t think a single, shiny word properly conveys an appropriate stratification of how important something truly is.  I have therefore created my new list for prioritizing project features.

  1. Essential
  2. Enhancement
  3. Chrome
  4. Bling
  5. Doilies

Obviously "Essential" items must go in.  These are the main project features.  Without these, there is no product.

"Enhancement" items might be something like a wizard interface to make common tasks easier.  Maybe you could survive without them, but you wouldn’t want to.

The standard "Chrome" entry is still there, but we can now divide off the truly non-essential showy things.  For instance, a context menu or shortcut key sits right on the border between enhancement and chrome: in the right hands it can make the user more productive, and it has become somewhat of an expectation these days, but it’s not going to hold up the beta testing.  This is a back-burner item, but should be there before the product ships.  I think tooltips might go at the chrome level, although it’s easy enough to argue that they are standard expectations.  They’re not essential, and once a user is used to the application they’ll never be used again.  Then again, tooltips provide essential information for new users and help get them up to speed on an unfamiliar application.  During development, remember that the Essential or Enhancement features the tooltips would support might be refactored, removed, or redesigned before the product ships.  Put the tooltips on hold until the main product features are in place and fairly solid, or you could end up wasting a lot of effort on things that get yanked later on.

"Bling" might include items that affect a product’s overall "feel".  Things like a unified set of high-color icons on the toolbar.  "Bling" doesn’t add any usability to a product, but gives it an overall feeling of quality.  It’s not terribly important, but it’s something someone should probably pay attention to before the product ships.  An example of bad Bling is TOAD, the third-party "Enterprise Manager" for Oracle.  Let’s not even get into what an Essential-level tool this is, and how many million points off Oracle gets for not writing it themselves.  Let’s just examine TOAD on its own.  16-color icons in this day and age?  Oh, and they’re not ALL 16-color; there are some 256+ color icons on the toolbars, mixed in with the 16-color ones.  The 16-color icons make me feel like I’m back in Windows 3.1.  The 256+ color icons are more "normal".  The fact that the two are mixed up together makes me feel that somewhere along the line, someone just couldn’t be bothered to unify the whole UI.  It’s a totally unimportant thing from a usability standpoint, but still manages to leave me with an overall feeling of "cheap".

And last, and certainly least, doilies.  These are items that we can all agree are totally non-important, like the doily under your baklava.  Is there any reason you can’t eat baklava straight off the plate?  No, but they put a doily under there anyway to soak up the extra juices.  Non-important in every respect, but it makes it look "fancy".  In the software world, this would be things like skinning support.  Skinning support offers absolutely nothing in terms of usability, and in the hands of a poor skin designer might totally ruin the user experience.  The fact that you have skinning support lets your users know one of two things.  Either your product is so solid that your developers had a ton of free time to build the skinning support, or that your priorities are totally out of whack.  Most of the time, it’s the latter.  Don’t even bother adding doily-level features to a product unless you are dead-sure that the rest of the application is absolutely solid.  It makes you look like you’re trying too hard.  It makes you look that much worse if a user finds an essential bug.

Posted in Computers and Internet

I’m still going to CodeMash (Originally Posted 1/8/07)


I didn’t think I was going to be able to go, since I’ve switched companies… but as it turns out

CodeMash – I’ll be there!

This is great.  The whole team from my new project is going, and I get to go with them.  Still not sure about the carpooling arrangements, but I imagine there will be a couple of groups… there’d have to be, right?  So now that the agenda’s finally filling in on the site, I can start making my plans for what sessions to attend.  Anything NHibernate, AJAX, or smart-client is on my list, with the gaps being filled with whatever seems most relevant to my new assignment.

This oughta be good.  I hope it catches on, and maybe next year they’ll do it again.

Posted in Computers and Internet

Distraction (Originally Posted 9/26/06)

Okay, this is fun.  It’s like Inkball meets the Dismount games.

http://www.deviantart.com/deviation/40255643/

Posted in Games

This society is doomed (Originally Posted 6/26/06)

Man you’d better hope the bird flu doesn’t make its way to America, ’cause I’ll tell ya something.  As a society we are NOT equipped to handle it.

I’m working at the department of health these days, and even though there are signs on the bathroom walls tutoring you on proper hand-washing technique I’m still blown away by the sheer obsessiveness of some people’s routine.  I just now (okay, five minutes ago) was in the restroom, and the dude in there ran out a good tail of paper towels from the dispenser, THEN washed his hands, left the water running, dried his hands, and then used the paper towel to turn the water off.  Oh, and then he used the automatic door opener to leave the room.

I could understand this if you’re a "bubble boy", but come on, people.  Do you really expect your immune system to protect you if you let it sit on the couch watching Oprah all day (or whatever the closest equivalent metaphor is for an immune system)?  And I bet you this is the same kind of person that uses a significant portion of their PTO each year for sick days.  Yes I wash my hands, and no I don’t avoid touching all surfaces for the rest of the day.  I hardly ever get sick, and when I do, it’s a big, strong virus, not the wimpy kind you find on the door handles.  If something knocks ME down, it’s the sort of thing that takes out half the city and ends up on the news.  (Yes, I’m aware of just how badly I’ve jinxed myself)

The whole family’s been sick for weeks.  My daughter hasn’t fully recovered in the last month and a half.  You know how long I had a cough?  TWO DAYS!  I wish I had the George Carlin quote to post here, but it’s on the "You’re all diseased" album, and it rocks.  I’m living by George’s example, and even if the bird flu is the next black plague and it wipes out all of humanity, I’ll outlive these executive wussy-boys long enough to loot their mansions and take their beemers out for a joy ride before I croak.

Posted in Society

Speaking ill of one’s neighbors (Originally Posted 11/8/05)

I try never to speak ill of my co-workers, but since I don’t actually work with the gentleman to my left, I can let it fly.  I’ve used my headphones more to drown him out than for actual entertainment for quite a while now.  Let’s call him Mr "O".  I’ve been forced to bear witness to his constant trials and tribulations for the last two months, and I just can’t take it anymore.

I’ve sat and listened to him try to get himself a new laptop out of the client because his won’t connect properly to the wireless network at the hotel, thus allowing him to work without actually coming in.  Also his attitude clashes with the Radio Shack guys who sold him three defective wireless routers in a row.

I witnessed him getting a whole new cell-phone because his old one wouldn’t get a signal here.  Here’s a tip… ask the people around you if THEY have a signal before plunking down the cash for an upgrade.  We ALL have no signal down here in the basement.  The phone only rings when he’s not here anyway so that we get to listen to his ENTIRE ringtone over and over and over.  Then he comes back and tries to use the thing to make a call every fricken time, like suddenly today’s the day he’ll have a good signal at last.

I’ve listened to him on the phone with Oracle every day for the last week and a half to the point where today it sounds like they are asking him not to call them anymore.  He didn’t threaten lawyers this time, but said he would inform his manager about their unwillingness to help.  Perhaps a restraining order is next.

I’ve heard him trying to procure a larger monitor by playing the "Oracle guys can’t see my whole screen" card.  This one I can understand, I couldn’t live on a 14" 800×600 screen either, but surely you could buddy-breathe off the monitor four feet to your left long enough to get off the phone with Oracle and get on with your so-called life.

I sat through a weeklong tirade with the aforementioned hotel about his expensive cashmere sweaters that went missing after a fire drill.  I heard him threaten to bring out the lawyers when the hotel said they are not responsible for items left in the room.  I’d be mad about it too, but a week after he changed hotels, someone called to say the sweaters had been found down the back of the dresser drawers.  We wouldn’t know about any of this if he hadn’t shared it with us, of course.  Imagine the gall of these people trying to make it look like his expensive cashmere sweaters weren’t stolen after all!!!  Good thing the maid put them back in the room after he left, eh?

Right about now I’m thinking that he is about the most clueless waste of space I’ve ever had the misfortune of sharing air with, and I’m going to have to rip my entire CD collection in order to survive it any longer.  Why couldn’t he have had a bigger monitor and no phone?  That would have suited me just fine.

Posted in Work

My crackpot Harry Potter theories (Originally Posted 9/1/05)

WARNING SPOILERS WARNING SPOILERS WARNING SPOILERS

If you haven’t read "The Half-Blood Prince" then stop reading now, for spoilers await ye with nasty sharp pointy teeth.

WARNING SPOILERS WARNING SPOILERS WARNING SPOILERS


Aunt Petunia is a witch (certainty 60%)

  • We already know she’s been in touch with Dumbledore, possibly more than once. 
  • She knows waaaay too much about the wizarding world. 
  • She shares a bloodline with Lily Potter, and according to J.K. Rowling’s OWN FAQ she is not a Squib.

So, if she’s got magic parents, and she’s not a squib, then she must be a witch.  Why then, is she living as a muggle?  I think she’s perhaps sworn off magic in an effort to hide out from Voldemort, or at least not draw attention to herself.  Perhaps she even had her memory modified so that she wouldn’t know she was a witch, but over the years her memory is repairing itself.  Who knows?  The important thing is that you can logically conclude that she was a witch.  Now it’s possible that she’s only Lily’s HALF-sister, and does not in fact have any magic parents, in which case my theory is shot.  Since we know that Lily was called a "MudBlood" by Snape from his own memory in the pensieve, we can assume that one of her parents was a muggle.  It’s probably in there somewhere, but I’m not sure off the top of my head.

CORRECTION:
After some minor amounts of research on a couple of fan sites, I see that Lily was muggle-born, so BOTH her parents are muggles.  But there must be some wizard blood in her history SOMEWHERE for her to be a witch, so maybe Petunia has it too.  We’ll see.


Dumbledore is alive (certainty 80%)

  • Dumbledore’s too smart to trust Snape without a pretty f’n good reason.
    Everyone in the world thinks Snape’s a sleaze, but Dumbledore won’t hear a word of it.  Also McGonagall said that his reason was "ironclad" although she did not know it herself.
  • He completely froze Harry rather than telling him to hide.
    If Dumbledore didn’t KNOW ahead of time that things were going to get "weird", he wouldn’t have frozen Harry, but simply would have silenced him, or TOLD him to remain quiet while he talked to Malfoy.  Certainly in the time it takes to utter "Petrificus Totalus", he could have said "Silencio" much more easily.
  • Wizards have faked their own deaths before.
    If Pettigrew could do it, then certainly Snape and Dumbledore could pull it off together.
  • Broken spells
    If Dumbledore’s "Petrificus Totalus" spell abated instantly on his "death", why do the death eaters still have to run to the edge of the grounds before disapparating?  Now this isn’t a very strong argument, since you would expect the castle enchantments to be of a slightly more permanent nature when cast.  It doesn’t actually do a lot of good to put up protections if all it takes is eliminating a single person to disable them, but I thought I’d mention it anyway.
  • New concepts in this book.
    To take a jaded, analytical look at things:  Why introduce the entirely new concept of non-verbal casting in this book if it is not of some importance to the story?  Also the idea of the Horcrux.  It’s essential to why Voldemort is immortal, but could also just be handy enough to save Dumbledore’s butt, right?  You have to commit murder to create one, but who says you can’t "Save" the soul you’re killing at the time, eh?
  • Snape’s combination of abilities.
    Snape is an "Accomplished Occlumens", able to hide his thoughts from even the other teachers and Dumbledore himself.  So he may just be the only person we know of capable of casting one spell non-verbally while speaking the words to another spell, and NOT being detected by those around him.  Snape’s ability as a "legilimens" would have let him read Dumbledore’s thoughts at that critical moment so that Dumbledore could tell him "Do it now!" without having to use some sign or code-word that might have seemed suspicious.
  • And the really jaded answer.
    It’s no fun to have the guy you’ve hated through the whole series turn out to actually be a bad guy, is it?  Where’s the twist in that?  Where’s the intrigue?  Rowling is FAR more interesting than that.
  • But what about the new portrait in the headmaster’s office?
    No-one has ever said that the portraits are only of dead people, and he did have a chance while Harry was off retrieving his invisibility cloak to have set up the painting himself, knowing that this was the night he must disappear.  It could very well be that the portraits exist throughout a headmaster’s tenure so that they may observe and remember the things he’s done.  Also, that portrait never speaks.  It could very well be a moving picture like those in the Daily Prophet, couldn’t it?  And again with the jaded analytical argument.  McGonagall makes what is only described as an "odd movement, as if steeling herself" when she sees the new portrait.  It’s not like Rowling to barely describe things like that unless she has a very good reason for it.

I’ll add more as I think of them, but these are the two most interesting areas of thought for me right now.

Posted in Entertainment

The “Hot Topification” of the underground. (Originally Posted 8/16/05)

You wouldn’t normally know it from the workplace, but I’m not what you’d call a Top 40 kind of guy.  I’m what you might in general call a punk.  Specifically, I always considered myself a "CyberPunk" back when the term was in vogue, but now I’m what they call a "RivetHead".  Time passes, terms change.  Hey, I remember when "Techno-Industrial" meant something.  It meant "James Brown is Dead" and "Spice".  Those days are gone.

Today, the label seems to be "Goth", and it’s what all the little kiddies want to be.  Gone are the days when the punks and Goths were the outcasts.  It’s the new "In" thing, and I personally blame Hot Topic.

Posted in Society