Ryan Lowe gets the value of dynamic languages after working in Ruby. Of course, I think Smalltalk is better :)
I had another look at the NetResources library, and I found out that it wasn't as flexible about setting the user agent as I'd like. I changed that, added a setting to my build, and now it's up - in the development downloads. The last few revs have been advertising themselves as Mozilla compatible, and there turns out to be a problem with that - at least one site I subscribe to classifies that kind of agent string as robotic. So, BottomFeeder is back to properly identifying itself, now that the right API is being used.
Over the weekend we went to see "Harry Potter and the Goblet of Fire". I really liked it - it started a bit slowly, I thought - but once it picked up steam it was "hold the edge of your seat" good. I thought they did a good job of portraying the interpersonal difficulties the characters were having in that book - something I had wondered about, given the action centering around the three challenges. The climactic scenes were well done - pretty much just as I had imagined them.
In my opinion, this is a movie well worth going to - run, don't walk.
Sometimes, being too aggressive about handling exceptions can be worse than not being aggressive enough. Over the last couple of months, Michael has been tweaking the NetResources library (it's in the public Store) to deal with some locking issues in the caching code. Last month, there was a change made that caused a few problems - it started trying to handle exceptions that it shouldn't. Here's the relevant code snippet:
response := [client executeRequestDo: [:connection | client getNotifyingResponse: connection]]
    on: (self class httpExceptions, Error)
    do: [:ex |
        self behaviourForException: ex.
        nil].
response ifNil: [^self].
The #executeRequestDo: message send executes the HTTP request (or not; it actually implements conditional-get). When you make an HTTP request, any number of bad things can happen - you can get timeouts, other network errors, redirects - if you don't get a response object, you'll get an exception. Some of those exceptions (like redirect) can (and usually should) be resumed - while others (server error) most certainly should not be. Still others - timeouts - can be resumed or restarted, but the decision is an application level issue.
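To make "resume" concrete for non-Smalltalkers, here's a tiny, generic example - an illustration, not the NetResources code. Warning is a resumable exception; because the handler sends #resume, execution carries on right after the point that signaled:

["Warning is resumable; the handler below resumes it, so all three lines run."
Transcript showCr: 'before the signal'.
Warning signal: 'something recoverable happened'.
Transcript showCr: 'after the signal - this still runs']
    on: Warning
    do: [:ex | ex resume]

Try the same thing with a non-resumable exception and the resume attempt itself raises an error - which is exactly why resuming something just because you caught it is asking for trouble.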
Note the exceptions being caught - #httpExceptions is a list of HTTP and network level exceptions, and Error is a catch-all. The logic is in #behaviourForException:. The mistake was in the old version of that method, and in the old version of the calling code above, which simply caught Error (i.e., pretty much everything, undifferentiated). Here's the handling code:
behaviourForException: ex
    ex class = Security.SSLBadCertificate
        ifTrue: [Security.X509Registry default addCertificate: ex parameter parameter].
    (self class possibleTimeoutExceptions includes: ex class)
        ifTrue: [^self class triggerTimeoutEvent: url].
    (ex isResumable and: [self class exceptionsWeShouldResume includes: ex class])
        ifTrue: [ex resume]
        ifFalse: [self reportTheErrorType: ex]
What that does is check what sort of exception we got, and then handle it based on that information. The old code just checked whether the exception could be resumed and then resumed it; that led to situations where BottomFeeder would report low level socket errors, because the code being resumed was in no state to continue. The relevant check is not only whether the exception could be resumed, but whether it should be resumed. In this case, that check is a simple test against this list:
exceptionsWeShouldResume
    "which ones should we actually resume?"
    ^Array
        with: Security.SSLBadCertificate
        with: Net.HttpRedirectionError
Note that the handler adds the offending certificate to the registry, which allows us to resume. In the original caller (excerpted at the top) there's logic for dealing with a redirect, so that gets dealt with. Other errors are presumed to be transient, and simply reported. For the purposes of an application like BottomFeeder, where we'll try to read a feed every hour (or whatever the interval has been set to), there's no reason not to ignore most errors. The only additional handling - which is done at the feed level - is to check the response to differentiate between a 404 (presumed to be transient) and a 410 (permanently gone); in the latter case, the app automatically disables the feed in question. A sketch of that check follows.
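As an illustration only - this isn't the actual BottomFeeder code, and the selector names are made up - the feed level check amounts to something like this:

"Hypothetical sketch: #responseCode, #disableFeed and #noteTransientFailure are
invented names; the point is just the 404 vs. 410 distinction."
response responseCode = 410
    ifTrue: [self disableFeed "410 Gone - this feed is never coming back"]
    ifFalse: [response responseCode = 404
        ifTrue: [self noteTransientFailure "404 - assume it's temporary, try again next cycle"]]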
The bottom line is this - you shouldn't mindlessly resume exceptions. It's as sloppy as swallowing MNU errors, and gets you into states that are really, really hard to diagnose.
The legal blowback over Sony's DRM escapade continues: California was already suing, and now Texas and the EFF have joined in. This is going to end up being a very, very expensive mistake for Sony - and a huge distraction that they can ill afford.
Something for the puzzlewits at the RIAA to consider, if they have enough sense to do so.
Avi has some thoughts about SSE from the versioning/synching standpoint:
If you look at SSE as a versioning system, there’s one somewhat glaring omission, which is that it doesn’t seem to handle the repeated merge problem that plagues CVS. In an SSE context, that would happen when, for whatever reason, you try to re-sync with the same set of concurrent changes for a second time (possibly because you’re syncing with a third party who got them from the same place you did originally). In the SSE spec as it stands, these would get marked as conflicting a second time (and potentially a third, and a fourth…), even though the user presumably already resolved that same conflict when they first saw the changes.
It's a good (and constructive) critique - you should read the whole thing if XML formats are of interest to you. My concerns are more pedestrian - in looking at the spec, I see two duplications between SSE elements and core RSS elements that concern me:
- link: "A required, URL attribute. The URL for related feeds". Ok, how and why is that different from the main (channel) link element?
- id: "A required, string attribute. This is the identifier for the item. The ID MUST be globally unique within the feed and it MUST be identical across feeds if an item is being shared or replicated as part of multiple distinct independent feeds." Hmm - that sounds an awful lot like RSS 2.0's GUID to me. This should be loads of fun in aggregated feeds when the two differ - see the sketch below.
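To make the overlap concrete, here's a rough, hypothetical fragment. I'm going from my reading of the draft, so treat the sx:sync element name and placement as assumptions - the point is simply that one item ends up carrying two identifiers:

<item>
  <title>Some shared item</title>
  <!-- RSS 2.0's identity for the item -->
  <guid isPermaLink="false">urn:example:item:1234</guid>
  <!-- SSE's identity for the same item (other required attributes omitted here) -->
  <sx:sync id="urn:example:item:1234">
    <!-- update history would go here -->
  </sx:sync>
</item>

Two identifiers for one item means two things that can drift apart once feeds get aggregated and republished - which is exactly the worry.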
Overall, it looks ok. Those two duplicates are a bit worrisome though.
With the launch of the XBox 360, Microsoft gets a leg up on Sony - a year's lead time to woo game developers and customers. Meanwhile, Sony is still trying to figure out what hurts, and why.
I guess it's cool that you can download the entire Wikipedia in XML, but wow - that's going to be one huge set of XML docs :)
Hypothetical Labs has an interesting take on what you should learn and why. I agree with the general principle being espoused, but I think he misses the static/dynamic divide. In any case, it's an interesting read.
“The problem with the SonyBMG situation is that the technology they used contained a security vulnerability of which they were unaware. They have apologized for their mistake, ceased manufacture of CDs with that technology, and pulled CDs with that technology from store shelves. Seems very responsible to me. How many times have software applications created the same problem? Lots. I wonder whether they’ve taken as aggressive steps as SonyBMG has when those vulnerabilities were discovered, or did they just post a patch on the Internet?”
I can think of a lot of words to describe the way Sony acted in this case. Sadly, "responsible" isn't one of them. This statement from them kind of says it all:
Sony BMG’s Global Digital Business President Thomas Hesse: “Most people, I think, don’t even know what a rootkit is, so why should they care about it?”
Yes, they stepped away from that. After getting slapped repeatedly.
Paul Graham has a lengthy article on Web 2.0 up; I liked the article, and it's well worth reading in its entirety. I wanted to pick out a few segments that really resonated with me. First, on "Democracy" and the web:
The second big element of Web 2.0 is democracy. We now have several examples to prove that amateurs can surpass professionals, when they have the right kind of system to channel their efforts. Wikipedia may be the most famous. Experts have given Wikipedia middling reviews, but they miss the critical point: it's good enough. And it's free, which means people actually read it. On the web, articles you have to pay for might as well not exist. Even if you were willing to pay to read them yourself, you can't link to them. They're not part of the conversation.
I've said that about WikiPedia before, and it's true about a lot of things on the web. Graham makes an even better point about content that lives behind a pay wall on the web:
On the web, articles you have to pay for might as well not exist. Even if you were willing to pay to read them yourself, you can't link to them. They're not part of the conversation.
Which is what's happened to the New York Times since they created "TimesSelect". No one links to them (for that matter, few people link to Salon, either). The ironic thing about the approach taken by the Times is that they've decided that you should pay for the commodity material - opinions (whether it's politics, sports, food, theater, whatever). I don't know whether the Times has noticed, but the web is awash in opinion content. The kicker is - their columnists just aren't good enough to warrant payment. It's not just them; almost no one is. Graham nails this in his piece:
My experience of writing for magazines suggests an explanation. Editors. They control the topics you can write about, and they can generally rewrite whatever you produce. The result is to damp extremes. Editing yields 95th percentile writing-- 95% of articles are improved by it, but 5% are dragged down. 5% of the time you get "throngs of geeks."
On the web, people can publish whatever they want. Nearly all of it falls short of the editor-damped writing in print publications. But the pool of writers is very, very large. If it's large enough, the lack of damping means the best writing online should surpass the best in print.  And now that the web has evolved mechanisms for selecting good stuff, the web wins net. Selection beats damping, for the same reason market economies beat centrally planned ones.
That's the problem that sites like the Times and Salon are raging against - there's simply too much "good enough" (not to mention the occasional really good) stuff out there. Why pay for opinion pieces? The funny thing is, the Times has something of value that the mass of the web can't match - actual news reporting. Anyone can do analysis and opinion mongering (not necessarily well, but you get my point) - but the Times is one of the few organizations with real global reach. They have (either directly or by agreement) people "on the ground" nearly everywhere - if an event (like the East Asian earthquake, for instance) takes place, they can get news out faster than nearly anyone else. Meaning that content actually has value. I don't know that they can charge for it, but they should certainly be able to command premium ad rates for it.
I'm doing the next Dev Build for BottomFeeder now - I'll have it uploaded by this afternoon. I've made some modifications to the network error handling - the library we use was catching too many errors and trying to resume (which could cause problems). There are a few other small changes here and there, and an update to the browsing code from SwS.
The whole DRM picture for CDs and DVDs just gets sillier and sillier. Have a look here - if you want to defeat DRM, all you really need is some tape:
Applying a piece of opaque tape to the outer edge of the disk renders the data track of the CD unreadable. A computer trying to play the CD will then skip to the music without accessing the bundled DRM technology.
The problem is not easily solved, either - there are tons of "legacy" players out there, and I can't see the industry being stupid enough to create a "day zero", after which no new CDs or DVDs will work on the older players. Not to mention the difficulty of getting the Linux community to care. Not to mention the fact that - in an international market - it just isn't that hard to get hardware from somewhere else (good luck convincing overseas vendors to avoid a money making opportunity).
Then there's the simpler issue - any noise in that direction would simply accelerate the move to downloadable content. Meanwhile, Gartner is proving that software isn't the only sector that they have a weak grasp on - check this bit of mutton-headedness out:
Gartner predicted that the music industry will start to lobby for legislation that requires computer makers to include DRM technology on their systems.
But the analyst advised that, instead of limiting what users can do with music they have already purchased, record labels should focus on tracking this use.
This would enable a "play-based" model where users are charged a fee based on how they consume music.
Yeah, "phone home" setups go down so well with people. Follow their advice on this one, and you'll have the kind of PR nightmare that Sony is dealing with.
I'm paging through this week's SDTimes, and I come across the opinion pages. Here's an article on software development by Djenana Campara (CTO of a company that sells Q/A tools), talking about how software development is a discipline, not an art.
Ok, I'm interested in that - so I read the article - is she going to advocate a CMMI type approach, an agile type approach... I don't know. After I finished, I still didn't know. I have no idea what she's advocating. She repeated the term "discipline based approach" over and over again, as if that should mean something to me. Sorry, it doesn't. It could mean anything, from a paper driven "fill in the form before proceeding" style to an XP/Agile lightweight set of processes. All by itself though, the term is meaningless. Here's an example:
But to be successful, companies must get their developers to buy into the discipline approach. Coders need to understand how a discipline-based development approach will improve their work life by helping them write code that adheres to corporate guidelines; reducing the monotony of testing for errors, tracking down defects and implementing fixes; and improving their coding quality and productivity.
In addition, by enabling their managers to better gauge the status of a project and the impact of requirement changes, developers will have a better understanding of how realistic project deadlines are.
The payoff? One large development corporation reports that embracing a discipline-based approach has cut the time its developers need to understand an application’s context in half. Another company says that a code analysis that typically required one week now takes only one hour.
It's buzzword bingo, but with a kicker - she won't take the time to explain what the buzzwords even mean. I'd love to know what she's advocating here, but it's impossible to tell.
I just got an email from a Smalltalker in Germany - Deutsche Bahn got some publicity for their rail scheduling application, which just happens to be written in Cincom Smalltalk. The video is in German - here's the info:
Last week BahnTV, the company television of Deutsche Bahn (German Railways), featured a 6 minute video on RUT-K, the VisualWorks based software produced by DB Systems which is used to build all the train path schedules in Germany. The video (in German) is online here.
Multi-threaded software as a scalability answer is apparently harder than most people think. Consider:
With both SQL Server and Citrix Terminal Server installations, HT-enabled motherboards show markedly degraded performance under heavy load. Disabling HT restores expected levels, according to reports from within the IT industry.
I've recommended multiple processes vs. multiple threads for years, but most people figured I said that because VisualWorks doesn't map Smalltalk threads to platform threads. It's not just me though:
"It's ironic," said Ibbotson. "Intel had sold hyperthreading as something that gave performance gains to heavily threaded software. SQL Server is very thread-intensive, but it suffers. In fact, I've never seen performance improvement on server software with hyperthreading enabled. We recommend customers disable it when running Citrix and our software on the same server"
Of course, this doesn't help matters much either:
Earlier this year, Intel hyperthreading was revealed to have a security flaw where threads could find information from each other through the shared cache despite having no access to each other's memory space.
I'd be more interested in knowing whether that flaw can be accidentally exploited to cause problems in an application.
BTW, this brings us around to the EFF, which now claims to be supporting the interests of bloggers. Well they miss this one very basic point. We're inevitably headed for a faceoff with Google, just like the one with the book publishing industry. I think it's pretty clear that the EFF will be defending Google, not us.
If that made a smidgen more sense, I'd call it incoherent. Never mind Google and the book publishing industry (although I tend to think that's part of the same disintermediation that's happening with music downloads) - what exactly can Google do to bloggers? I suppose they could remove a blog from their search index for reasons A, B, and C, but there's no legal problem with them doing that.
Here in the US, a lot of people get confused about the first amendment. It guarantees that the government will not censor you. It says nothing about your friends, your associates, your employer (etc). Unless you get to the level of defamation or libel (or physical threats), any non-governmental entity can try to censor you as much as it likes.
Go ahead, start wandering around the halls at work yelling "The CEO is a (insert insult here)". See how much free speech you have :)
Update: Time for the tinfoil hat.
Lawsuits were filed regarding the PlayStation®2 computer entertainment system models 30001, 30001R, 35001, 39001, 39010, 50001, and 50010 (“PS2”). The plaintiffs in those cases claim that certain inappropriate “Disc Read Error” messages are displayed. They also claim that some PS2s fail to play and that some of them cause damage to CDs or DVDs during playback. The companies that were sued, including Sony Computer Entertainment America Inc. (“SCEA”), say they did not do anything wrong and there is nothing wrong with the PS2.
With a settlement, you have no idea if there was actually a problem - companies settle suits all the time, because the cost of litigation exceeds the cost of paying off the lawyers (did someone say "legal extortion"?).
However, given Sony's recent troubles over the rootkit DRM, this is just more bad news that makes you go "hmmmm..." about purchasing Sony stuff.
Morgan McLintic's post on the artwork in his building reminds me of my general theory about art:
If it could be created by accident by a toddler, it's not art. And that applies to abstract shapes in metal/plastic/what have you - if a toddler could create a small version of it in Play-Doh, it's not art either.
Michael found the problem in BottomFeeder that bunched up some of the marked up text (italics, bold, links) - it was an issue with the way we were using libtidy (which is why people on platforms without libtidy weren't seeing it). The latest code is part of the dev stream updates, and will hit the next dev build - which I'll probably do tomorrow.
Well, I won two quick games of Puerto Rico this morning, which got me into the semi-finals. That was a nail biter. I voluntarily took first position (I wanted a quarry) - and lost 48-47, just missing the finals. Victoria came up in the late afternoon and tried a game, but didn't make the cut either. We had a good time though - she tried a game of Railroad Tycoon (a board game based on the old PC game - go figure), and I got in a round of Carcassonne and Lost Valley.
A decent day, even if we were out later than we should have been. Off to bed!
I think Cisco is trying to get ahead of the curve - they just bought Scientific Atlanta (a big provider of cable set top boxes - Comcast uses them, for instance). This follows their 2003 purchase of Linksys, and makes them a pretty big player in the home network space.
Three lackluster Puerto Rico games didn't cut it last night. I did win a round of Settlers, but - since I didn't get there in time for the first round, I didn't have enough points for today's final. Back up to the convention to see if my luck turns.
Time for this week's log report. BottomFeeder downloads stayed strong at 355 per day; here's the platform distribution:
Not too shabby - and 4.1 should be out soon. Next: The HTML page accesses:
Tool | Percentage of Accesses
Curious - Mozilla rose back up again. Is that a shift in readers, or a shift in what readers are using? Finally, the RSS Accesses:
Tool | Percentage of Accesses
Net News Wire | 10%
Looks like the relative weight of Mac readers is up on the RSS side. Hmmm
I'm off to Timonium, barely leaving myself enough time to get there for the Puerto Rico tournament. With any luck, I'll have happier news to report than I did last time I did this.
When we got our new Comcast HD/DVR box, we opted to keep the old digital cable box (yes, we like tithing more money to our cable provider :/). The main reason - we wanted the Replay TV to continue recording the (large volume of) non-HD content that comes through the old digital box, while having the (really lame) Comcast DVR record the HD stuff.
That was a fine plan, except that the IR blaster cable broke - that's the cable you hook into the Replay so that IR signals sent to it can be routed through to the cable box. It turns out a replacement was a mere $6.95 plus shipping away. It just arrived, and it works great! Back to "more TV than we could ever watch" nirvana :)
Joel hits paydirt on the rationale for variable pricing of music - it allows the industry to maintain control:
Theoretically, when a super-duper-blockbuster comes out, like, say, Lord of the Rings, there's so much demand that the movie theaters just end up turning people away. Econ 101 says that they should raise the price on these ultra-popular movies. As long as the movie is sold out, why not jack up the price and make more money?
And why don't they do that? Joel explains that the price sends a signal - if a movie came out with a lower first-run price, that would be a massive, public "thumbs down" on the movie; the likely result would be fewer viewers, not more viewers drawn by the lower price. How does that relate to music?
Now, the reason the music recording industry wants different prices has nothing to do with making a premium on the best songs. What they really want is a system they can manipulate to send signals about what songs are worth, and thus what songs you should buy. I assure you that when really bad songs come out, as long as they're new and the recording industry wants to promote those songs, they'll charge the full $2.49 or whatever it is to send a fake signal that the songs are better than they really are. It's the same reason we've had to put up with crappy radio for the last few decades: the music industry promotes what they want to promote, whether it's good or bad, and the main reason they want to promote something is because that's a bargaining chip they can use in their negotiations with artists.
The upshot - they can hold a gun to the head of artists, threatening them with the lower price. Instead of what happens at the box office - and on the iTunes interface - the industry can continue to promote the artists they want that way, and ditch the ones they don't want. The iTunes system gives end customers much more power over the system - and enables the artists to more reliably gauge their actual worth. As Joel says, that's the last thing the industry wants.
Unless something comes up, I'm going to release 4.1 on Monday. The development build on the site is what I'm running, and it looks pretty good - I've not had trouble since I've been using it.
The New York City STUG is meeting December 7th - this is from Charles:
Hi members, I'm a bit late with sending this off and since I'm taking off for the weekend I thought it best to at least provide a heads up on our next presentation. Here it is:
Date: Wed., December 7th, 2005
Time: 7pm , open house at 6:30pm
Place: same as always, check out the wiki for directions
Topic: VisualWorks and Algorithm design
Speaker: David Siegal
David has been working on some algorithms for measuring distances between strings, which can also be applied to problems such as DNA analysis. The implementations are in VisualWorks Smalltalk.
I'll update our wiki on Monday with David's abstract and bio. Here is the link for the wiki.
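If you haven't bumped into string distance measures before, the classic example is Levenshtein (edit) distance - the minimum number of single character edits needed to turn one string into another. Here's a minimal Smalltalk sketch of the standard dynamic programming version, just to illustrate the idea (it's not David's code):

levenshteinFrom: a to: b
    "Answer the minimum number of single-character insertions, deletions and
    substitutions needed to turn the string a into the string b."
    | prev curr cost |
    prev := (0 to: b size) asArray.
    1 to: a size do: [:i |
        curr := Array new: b size + 1.
        curr at: 1 put: i.
        1 to: b size do: [:j |
            cost := (a at: i) = (b at: j) ifTrue: [0] ifFalse: [1].
            curr at: j + 1 put:
                ((((prev at: j + 1) + 1) min: ((curr at: j) + 1)) min: ((prev at: j) + cost))].
        prev := curr].
    ^prev at: b size + 1

Feed it 'kitten' and 'sitting' and you get 3 - the textbook answer. DNA analysis uses the same basic idea, with weighted costs for the different kinds of edits.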
In the MS world, they have you creating a list of tests you think you'll need, then using VisualStudio to generate your code, then using VisualStudio to generate tests. Then you compare the tests to the list, and add any that are missing.
Michael Feathers has a few issues with that:
The key advantage that TDD gives you is feedback. You write a failing test case, make it pass, and then you formulate the next test case. The wonderful thing is that when you get to that next test case, you have the feedback from getting the first one to pass to draw upon. Often that feedback will lead you to formulate the next test case in a different way. It could even lead you to produce a different interface for the class you were writing.
The style of TDD described in the guidelines would have us jump ahead and write five, ten, maybe twenty test cases before getting the first one to pass. You can do that, but it's like putting on a set of blinders. When you don't have to formulate the next test case right after passing the last one, there isn't much chance to think about improvements. Worse, there is a disincentive to thinking about them: if you find any, you have to delete all of the speculation you've hardcoded in the tests and interfaces you created in advance.
What it sounds like to me is this: The VS team tried to warp TDD so that it hit the strengths (as they see them) of their tools. I wouldn't call what they came up with TDD though.
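To put the rhythm Feathers describes in concrete terms, here's a tiny SUnit-flavored sketch - the Counter class and the test selectors are made up for illustration:

"Step one: write a single failing test, then write just enough code to make it pass."
CounterTest >> testNewCounterStartsAtZero
    self assert: Counter new count = 0

"Step two: only now, with the feedback from step one in hand, formulate the next test.
Its shape - and maybe Counter's interface - can change based on what you just learned."
CounterTest >> testIncrementBumpsCount
    | counter |
    counter := Counter new.
    counter increment.
    self assert: counter count = 1

Generating twenty test stubs up front gives you none of that feedback loop, which is exactly Feathers' point.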
Rod Serling, through the magic of old footage and computer work, will introduce the Nov 21 episode of "Medium". They are having an actor provide his voice though. I think we'll see the ability to have the original actor - via sampling of audio of them speaking - provide their own voice in the not too distant future.
Makes you wonder whether we'll eventually see "new" Ginger Rogers/Fred Astaire dance scenes...
Smalltalk work, looks like it's in Miami - help upgrade an application from VW 2.5.1 to VW 7.
Scoble and Shel Israel make a good point about blogging - you can't force it:
He and I met the other day and he told me that he is hearing lots of companies thinking about integrating blogging into their marketing plans. You know, treating it as yet another marketing “chore” that teams need to do to be able to ship. He thought that idea is lame and will end in failure for those who approach blogs that way. I agree.
Blogging is writing, when you get right down to it, and you can't really force that. When your organization produces white papers, you probably have specific people who produce them. Sure, they get feedback from a variety of people in the organization, but those people don't each write a paragraph.
The same thing applies here. If you want effective blogging, it needs to come out as a natural voice. Forcing it on a team as part of their day to day job isn't going to result in a natural voice, and - if they are irritated by the mandate - then that irritation will shine through.
Bruce Schneier has a good summary of the mess - and he looks in places that a lot of us missed. He points out that the anti-virus companies were (and are) in the tank on this one - even MS, hailed for announcing that their tool would remove the malware, was very, very late to the party:
The story to pay attention to here is the collusion between big media companies who try to control what we do on our computers and computer-security companies who are supposed to be protecting us.
Initial estimates are that more than half a million computers worldwide are infected with this Sony rootkit. Those are amazing infection numbers, making this one of the most serious internet epidemics of all time -- on a par with worms like Blaster, Slammer, Code Red and Nimda.
What do you think of your antivirus company, the one that didn't notice Sony's rootkit as it infected half a million computers? And this isn't one of those lightning-fast internet worms; this one has been spreading since mid-2004. Because it spread through infected CDs, not through internet connections, they didn't notice? This is exactly the kind of thing we're paying those companies to detect -- especially because the rootkit was phoning home.
But much worse than not detecting it before Russinovich's discovery was the deafening silence that followed. When a new piece of malware is found, security companies fall over themselves to clean our computers and inoculate our networks. Not in this case.
That is an amazing thing, and Bruce is the first (that I've seen) to bring it up. Read his whole post - some of the companies have really, really lame excuses. It does end up sounding a lot like collusion.
Tomorrow evening and Saturday I'll be north of Baltimore at EuroQuest 2005, a board gaming tournament. I'll be getting there in the late afternoon, in time for the Puerto Rico tournament. I sure hope I do better than last time, when I completely washed out in round one. Should be fun!
InformationWeek has a roundup of the storm that has engulfed Sony over their DRM rootkit these last few weeks. I've been writing on this one extensively - InformationWeek provides a nice summary of the event. The upshot for companies is here:
"It seems crystal clear that but for the citizen journalists, Sony never would have done anything about this," says Fred von Lohmann, senior intellectual property attorney for the Electronic Frontier Foundation, a cyber liberties advocacy group that has been vocal in its condemnation of Sony and may eventually file a lawsuit against Sony, in addition to three that have already been filed. "It's plain to me that it was Sony's intent to brush the story under the rug and forget about it."
Alan Scott, chief marketing officer at business information service Factiva, said, "I think that we're in an entirely new world from a marketing perspective. The rules of the game have changed dramatically. The old way of doing things by ignoring issues, or with giving the canned PR spin response within the blogosphere, it just doesn't work."
Thomas Hesse, Sony BMG's Global Digital Business President, attempted to do just that by dismissing the online protests. "Most people, I think, don't even know what a rootkit is, so why should they care about it?" he said in a November 4 interview on National Public Radio's Morning Edition. He added, "The software is designed to protect our CDs from unauthorized copying and ripping."
That last paragraph might well have worked a decade ago. All Hesse would have had to do was convince a few journalists from Time, Newsweek (et al.) that their intentions were good, and that would have been the end of the story. It's not as if most business reporters understand technology - they would have played it as Sony vs. a bunch of black hat hackers (various bad examples like the Morris Worm would have come up), and poof - end of story.
Now, there are plenty of knowledgeable people - like Mark Russinovich - who have a platform on which to explain the problem. Others can then echo the original report, linking back to it as a source of authority. Over time, the MSM picks it up, and the story catches fire - leaving the original response from the company in question (third paragraph, above) looking stupid.
Sony learned this the hard way - they spent a couple of weeks taking damage, and - in the process - convincing a lot of people like me that Sony products just aren't trustworthy. A rapid response would have made the whole thing go away. There are a lot of PR/marketing people out there who simply haven't adjusted to the new reality. If you do something lame, you can't just spin and watch the problem go away as the media gets distracted by something shinier. I rather suspect that no one in a position of influence at Sony was watching the rising BlogStorm, so management just chugged along, confident in their outdated view of how messaging works.
There's a lesson in that for other companies - but I'm willing to predict that an awful lot of them haven't paid any attention to this mess, and will make the same mistake when it happens to them.
Here's another breathless story on the $100 (actually, it looks like it will be $200) notebook. There are some cool things about this, including the fact that it can be powered by a hand crank. However, there are a number of simple problems too -
- For the truly poor, access to laptops isn't a solution. Access to clean water is way, way higher on the scale
- Tech support. Ok - you hand out a few hundred in some remote village. What the heck do the new users do when there are problems?
This is a pie in the sky solution, IMHO. It's like deciding to hand out cheap cars, and only later noticing that there are no gas stations for the recipients to use. I understand that the people behind this are well intentioned - but laptops are only useful when there's a hell of a lot of other infrastructure supporting them. The well intentioned folks behind this plan need to aim a lot lower.
PR Opinions is really, really unhappy that the general public has the ability to pierce the PR veil and make their opinions known:
I, like many others, have grown tired and weary of the self-satisfied, holier than thou, "A-list bloggers" who believe they hold disproportionate sway on the matters of the day. This isn't everyone of course but the sooner we call time on this endless circle of self-gratification the better.
Yes, it was much better back in the day when PR pros held all that power, wasn't it? A phrase comes to mind:
"If you can't stand the heat, then get out of the kitchen"
There are a lot of PR folks out there who can't tell that there's no return to "the good old days", when they were the only ones you could hear. Bah.
There might yet be a chance that Civ 4 will actually work on my machine. Apparently, there have been a large number of complaints, and a patch is coming. I'll report the results when I see them.
I took a look at the game console business yesterday - and then I ran across this "future history" piece. It's a decent piece of analysis, including a set of caveats at the end, where the author (Aaron Stanton) notes various things that could muck with his predictions.
I found it to be well written, and very insightful. We'll see how it plays out - but I think Microsoft is the big winner here. Nintendo is profitable in the console space, and I agree with Stanton that they'll stay that way. Between Sony's DRM nightmare, the PR screw up surrounding it, and their continuing business problems, I think Sony drops to number three.
This is the kind of thing you would rather see in the pre-demo tests:
In the programme, Mercedes drives 3 S-classes behind each other. The first car hits the brakes, and the two following cars brake automatically, to avoid a crash. In theory. In practice, the first car braked, and the two others continued right into the back of each other, resulting in a 3-car pile-up at the test site.
According to a German news site, the blushing Mercedes security engineers soon discovered the problem: The test had been done in a hall which was made of steel. This confuses the radar, and the system doesn't work properly, causing more than £150,000 worth of cars to crumple into each other.
The system "works perfectly in all other circumstances", according to Mercedes. For now, though, it may be worth keeping your foot near the brake pedal.
Reminds me of the "Windows Live" demo. If you plan to give a demo with cameras rolling, it's always a good idea to try a dry run in the same place first.
Ok, this is kind of funny. Troy is complaining about the slow response times at Technorati:
Updating at Technorati seems to be more unreliable than usual the past couple of weeks. Jim says "I hear they're having troubles (again)." I did file a problem report with them, but other than an automated reply a few days back, I've heard nothing. Some tags that seem stalled are updating for other blogs, but not mine (nor Jim's the last time I checked).
I saw that post right after I saw this one from Scoble:
I’m sitting here with David Sifry, founder, and Niall Kennedy, community manager, of Technorati. They just pushed out a major update. Much faster. Much much faster.
Seems faster to me too. I guess Troy's frustration is something the Technorati folks have been aware of.
I loved this CNet article on the security risks of things like USB drives, iPods (etc, etc) - the security people are worrying about the risks of having them in the office:
Connecting the gadgets to work PCs could lead to a number of unwanted scenarios, Laudermilch said. For example, malicious code that crept onto the device at home could enter the corporate network unseen by the firewall or intrusion detection software, he said.
Also, a disgruntled employee could copy confidential information to the device and walk out with it. Classified information on a mobile device could be a business risk even when used by loyal workers, when their gadget is lost or stolen, for example.
As opposed to, say, connecting remotely via VPN and doing the same thing at home. If you have any staff working from home offices (or any staff that ever travels), you already have this problem. Banning devices from the office isn't going to solve the problem - but it will irritate the staff.
The official statement is here - the relevant portion:
We deeply regret any inconvenience this may cause our customers and we are committed to making this situation right. It is important to note that the issues regarding these discs exist only when they are played on computers, not on conventional, non-computer-based CD and/or DVD players.
Our new initiatives follow the measures we have already taken, including last week’s voluntary suspension of the manufacture of CDs with the XCP software. In addition, to address security concerns, we provided to major software and anti-virus companies a software update, which also may be downloaded at http://cp.sonybmg.com/xcp/english/updates.html. We will shortly provide a simplified and secure procedure to uninstall the XCP software if it resides on your computer.
However, there's a section at the end that leaves me deeply ambivalent:
Ultimately, the experience of consumers is our primary concern, and our goal is to help bring our artists’ music to as broad an audience as possible. Going forward, we will continue to identify new ways to meet demands for flexibility in how you and other consumers listen to music.
Based on their attempts thus far, I'd say that the "consumer experience" is the furthest thing from their minds. I still read this whole thing as "darn, we got caught!". Yes, the recall is the right thing to do. The next step is to see how they follow it up.
Wow - from 90's media darling to utter collapse - AOL is losing customers at a rate of 300 per hour!
As of September 30, the AOL service totaled 20.1 million U.S. members, a decline of 678,000 from the prior quarter and 2.6 million from the year-ago quarter. In Europe, the AOL service had 6.1 million members, a decrease of 98,000 from the previous quarter and a decline of 170,000 from last year's quarter.
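Quick sanity check on that rate: 678,000 subscribers lost over a quarter of roughly 92 days works out to 678,000 / (92 * 24) ≈ 307 an hour, so the headline number holds up.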
I have a slight quibble with the author of the linked piece though. It's not so much that they are doing anything wrong, it's that they are selling buggy whips as the automobile era is hitting its stride.
About the MVP award, that is. Contrary to what he seems to think, a guy (Ortiz) who is solely a DH simply can't be compared to a guy who plays an actual position. Rodriguez not only hit a lot of homers and drove in a lot of runs - he stole a lot of bases and played a great third base. Ortiz almost never plays the field (and has hands of lead when he does), and is a worse base runner than most catchers. Face it Sox fans - he's a one dimensional player.
Now, had the push been for Ramirez - you might have an argument. He's a weaker fielder than Rodriguez, but he is a complete player.
Wow - according to a Merrill Lynch analysis, the PS3 is going to be very expensive - possibly double the cost of the XBox 360 (which is going for $299 or $399):
According to Merrill Lynch, a financial management and advisory company, because of the cost of hardware components, the PS3 may cost twice as much as the Xbox 360 by the end of 2006. The report includes an estimated breakdown of hardware costs by component and shipping predictions. Manufacturing costs have been the main factor. Here are some quotes from the report.
"…The PS3 will not only be significantly more costly than Xbox 360 at launch, but will continue to operate at a cost disadvantage for several years. … We think that the Xbox 360 could be selling at half the price of PS3 in the latter half of 2006."
So - not only are they giving Microsoft a year to itself in this space - they are going to have a serious price disadvantage as well. That's going to be a rather large problem, I think.
Looking at the current game systems, the GameCube retails at $99. For a large segment of the target audience, that qualifies as an impulse purchase (psychologically, getting under $100 does that, I think). The PS2 and the XBox are both at $150, which is up - but not that far up. Sony has had the advantage over MS for awhile, having hit the space before them, and having a larger batch of games.
The new XBox 360 price is too high to be a pure impulse play, but it's within the "we'll just get one big present" area for Christmas. Double that price though, and you are past the price of entry-level PCs, for gosh sakes - and into serious "can we afford it" discussions between spouses. If Merrill Lynch is right, Sony is going to have to sell at a huge loss just to get in the game.
Via Digg, I found this map of the spread of vulnerabilities from Sony's rootkit DRM. Now, consider the civil (and possibly criminal) liability they have - across multiple jurisdictions. Sony's going to be paying for a lot of legal help.
Sci Fi Wire reports that Brando's movie career hasn't been stopped by his death - he's playing Jor-El in the upcoming Superman movie:
Bryan Singer, producer-director of the upcoming Superman Returns, told SCI FI Wire that he used every trick in the book to resurrect the late Marlon Brando and include him in the film as Jor-El. Brando played the role of Superman's father in director Richard Donner's original 1978 Superman movie. He died in July 2004 at the age of 80.
To recreate Brando's version of the character in his new Superman movie, Singer said in an interview that he used "a combination of unused footage, [used] footage and recreated footage. You won't necessarily see Marlon Brando walking around or reanimated in a conventional sense, but you will hear [dialogue] that you have heard before [and] takes that you haven't heard before and a rendering that is completely new."
A few more years of software work, and I fully expect to see brand new flicks starring classic stars.
There have been a bunch of posts about base.google.com - so I figured I'd take a look, now that it's up. What is it? It's a simple online database with basic tag support. It's a bit like an extended version of flickr and del.icio.us - you can tag anything (not just photos), and you can upload the things you tag to Google's servers. It's an online, outsourced database without SQL or RDF, to be really brief about it. The real question is whether people start building applications with it.
Aha! I just stumbled on Google's take, and there's this:
Right now, there are two ways to submit data items to Google Base. Individuals and small website owners can use an interactive user interface; larger organizations and sites can use the bulk uploads option to send us content using standard XML formats. Rather than impose specific schemas and structures on the world, Google Base suggests attributes and item types based on popularity, which you can use to define and attach your own labels and attributes to each data item. Then searchers can find information more quickly and effectively by using these labels and attributes to refine their queries on the experimental version of Google Base search.
If you point your browser here, you'll find instructions on the bulk upload procedures. Looks like they support tab delimited, RSS, and Atom formats.
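Just to give a flavor of the tab-delimited route: as I read it, a bulk upload file is a header row of attribute names followed by one row of values per item, with the columns separated by actual tab characters (shown here with spaces). The attribute names below are ones I made up for illustration - check Google's instructions for the real requirements:

item_type    title               price      location      link
used cars    1996 Honda Civic    1800 usd   Fairfax, VA   http://www.example.com/civic.html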
Update: Well, as it happens, there's no supported driver for my system (A Thinkpad R51) with a Radeon 7500. So, it seems that Civ 4 is just not going to work until a new driver comes out - if it comes out.
If you visit this page, you'll find a rather complex set of instructions for getting Civ 4 installed and running. I found a far simpler way. Right click on the screen, go to properties, advanced, then to the driver tab. See that "Update Driver" button? Try that. Worked for me, without all the back and forth described on the linked page.
Update: Well, maybe not. The game runs, but with hosed up graphics. Sigh.
Well, I'm suffering through take two on the Civ 4 install. In the meantime, I figured I'd run Bf over here on the Mac mini - which is where the post is coming from. The biggest issue? I'm so used to the laptop keyboard that a full size one throws me.
Here's some good news from Sony - they say that there will be no DRM on the PS2 or PS3 disks that would prevent you from playing a game on arbitrary Sony consoles:
However, in order to give an official answer to an already increasing wave of unrest among the gamers, Sony Computer Entertainment Europe has stated, through one of its spokespersons, that "this is false speculation and that PlayStation 3 software will not be copy protected to a single machine but will be playable on any PlayStation 3 console".
Seems that someone learned a thing or two from the savage beating they've taken over the last few weeks.
Oh, and just to make sure that yesterday was extra glorious - we set up a splitter for the new HD capable cable box and the old one, so that the ReplayTV could stay hooked up to the old cable box (it can't deal with HD content anyway, and the new box is only a basic DVR). That was all fine, except that - somehow - in the process of unhooking and hooking everything back up, I broke the IR blaster (a cable that sends the IR signal from the Replay to the cable box).
Fortunately, Google knows all. A quick search located this page, and I now have a new cable on order. With luck, we'll be back in business in a few days.
I've been hearing good things about Civ 4, so I bought it at Best Buy last night. On top of the server problem I had last night, this purchase made for a huge headache. Installation went smoothly enough. Then I tried to start the game.
Time passed. And passed. And passed some more. Finally, a screen with a globe arrived. Keyboard and mouse input was completely ignored for a long while, but finally I got a spinning cursor - and the program just poofed. No error messages, nothing - just poof.
Joy. I hunted around, and found this page - which tells me that I need to uninstall my display drivers and then update them before I can play the game. Now, call me crazy, but that seems a little extreme just to play a game. Do I really need to jump through those hoops?
Apparently, one of the survivors in marketing at Sony managed to find management and drop a clue on them - they are pulling the bum CDs off the market:
Sony BMG Music Entertainment said Monday it will pull some of its most popular CDs from stores in response to backlash over copy-protection software on the discs. (Related item: Firestorm rages over lockdown on digital music)
Sony also said it will offer exchanges for consumers who purchased the discs, which contain hidden files that leave them vulnerable to computer viruses when played on a PC.
"Sony BMG deeply regrets any inconvenience to our customers and remains committed to providing an enjoyable and safe music experience," the company said. Sony says more than 20 titles have been released with the XCP copy-protection software, and of those CDs, over 4 million have been manufactured, and 2.1 million sold.
Some of the management meetings at Sony must have been utterly fascinating over the last few days, as they slowly worked their way around to doing the right thing.
The outage we had yesterday left a few artifacts lying around that gave the server trouble last night and this morning. I'm pretty sure I've got those sorted out now, but we'll see how things go.
If you grabbed the BottomFeeder dev build, you have probably seen a problem - images don't always display on the first selection of an item. This is a problem with the latest releases of the SwS component I use, so I've stepped back to a known working version, and I'm doing a new build. I'll let that simmer for a few days, and - if there are no issues - release 4.1. Fingers crossed :)
Sony's response to the DRM scandal continues to be counter-productive at the PR level, and inept at the technical level. Have a look at this report from Freedom to Tinker:
Alex Halderman and I have confirmed that Sony’s Web-based XCP uninstallation utility exposes users to serious security risk. Under at least some circumstances, running Sony’s Web-based uninstaller opens a huge security hole on your computer. We have a working demonstration exploit.
We are working furiously to nail down the details and will report our results here as soon as we can.
I thought at first that they had replaced all the PR and marketing staff with lawful evil lawyers (D&D reference there :) ). It looks like they replaced their technical staff with zombies at the same time.
Why, they get cancelled. Sci Fi Wire reports that Kolchak still gets no respect:
ABC has canceled Night Stalker, the second time the network has given the axe to a series about reporter Carl Kolchak and his pursuit of supernatural phenomena, Variety reported.