For all of you who still enjoy 7th grade humor, I give you... the ship.
Theater owners are solving the wrong problem - they want to jam cell phones:
With flat panel prices in free-fall and sizes approaching those anaemic megaplex screens, it’s no wonder the National Association of Theater Owners is in a panic about the decline in consumers willing to slap down a near-sawbuck to watch a flick. What to do? Well, it looks like cellphone jamming is the new baby jeebus in an industry attempt to lure your azz back into their buttery seats. See, NATO (er, yeah) has announced plans to petition the FCC for permission to jam cell phone signals within theatres to “block rude behaviour”
Would actually making the audience aware of a rule (silent/vibrate/off) - and enforcing it with expulsion if need be - be just too hard? I don't typically have to worry about emergency calls during a movie, but some people (doctors, for instance) might not have that luxury.
Maybe theater owners should ask Hollywood to create a better product instead.
Via Rob Fahrni - which hero are you? Looks like I'm:
You scored as Captain Jack Sparrow. Roguish, quick-witted, and incredibly
lucky, Jack Sparrow is a pirate who sometimes ends up being a hero,
against his better judgement. Captain Jack looks out for #1, but he
can be counted on (usually) to do the right thing. He has an
incredibly persuasive tongue, a mind that borders on genius or
insanity, and an incredible talent for getting into trouble and
getting out of it. Maybe its brains, maybe its genius, or maybe its
just plain luck. Or maybe a mixture of all three.
Which Action Hero Would You Be? v. 2.0
created with QuizFarm.com
How in the world can anyone think it was a bad year for the movies when so many were wonderful, a few were great, a handful were inspiring, and there were scenes so risky you feared the tightrope might break? If none of the year's 10 best had been made, I could name another 10 and no one would wonder at the choices. There were a lot of movies to admire in 2005.
A box-office jolt from the magic kingdoms of Kong, Narnia and Hogwarts will close Hollywood's year with some holiday cheer, though not enough to offset the biggest decline in movie attendance in 20 years.
What Ebert isn't seeing is that lots of the "critically acclaimed" movies are movies that don't get much box office. I think the movie industry - and Ebert - might be able to learn something from a thing Sam Goldwyn supposedly said: "If I want to send a message, I call Western Union".
And please, enough with "risky" movies and "edgy" scenes. More acting, more story lines, less commentary. And whatever you do, keep George Lucas away from the script.
Rogers' exploration of the editing practices of Jimmy Wales makes the same point I made awhile back - topical content is going to oscillate for a long while, especially when the content covers something on which people disagree. Biography pieces on living people definitely fall into that bucket :)
One reason often quoted for learning Smalltalk is that it is "pure" and thus forces people to think and program "object oriented." I will not go into the discussion about "purity" beyond mentioning that I think that a general purpose programming language ought to and can support more than one programming style ("paradigm").
The point here is that styles that are appropriate and well supported in Smalltalk are not necessarily appropriate for C++. In particular, a slavish following of Smalltalk style in C++ leads to inefficient, ugly, and hard to maintain C++ programs. The reason is that good C++ requires design that takes advantage of C++'s static type system rather than fights it. Smalltalk supports a dynamic type system (only), and that view translated into C++ leads to extensive unsafe and ugly casting.
The reverse is true as well, of course. Once you get past a high level view of your system, design gets language specific - you'll approach the same task differently in C++, Smalltalk, Java (et al.). Which language to use depends on a best fit analysis - clearly, I'm of the opinion that Smalltalk is a better fit for most application level tasks. But the larger point is what Bjarne said.
I've tossed out a new Survey on our website - it's short, with a few questions about your tool desires. Let us know what you think, and send me email if you have comments that go beyond the survey.
Another thing that I should mention in the context of this post - and the comment trail after it. On this blog, I do an awful lot of "thinking in public" - I tend to post my first impressions, not after long cogitation on an idea. Which means, my initial take on an idea does not always represent the sum total of my thinking on a subject - and a mostly dismissive reaction doesn't mean that I'm not discussing the idea with our engineers.
Having said that, bear in mind that our engineering team is engaged in a pretty darn big project - we are building a system that is as large and comprehensive as J2EE or .NET, with a fraction of the engineers that Sun and Microsoft have. Sure, Smalltalk is marvelously productive, which is why we are able to stay in the game - but we do have resource limits, and - as such - a large part of what I do is prioritization.
If you ever wondered where driving habits (as in, which side of the road) come from, here's a background page. Interesting how much influence Napoleon had over the process in Europe, especially in the case of Austria.
Read this from Philip Greenspun and you'll understand:
We did three sightseeing/photo flights over New Orleans. The first was with Vincent, who oriented me to the area. The second was with Ernie, who pointed out some additional sights and breaches in levees. For the third flight, we removed the left door of the R22 and left it with the FBO. Tony flew from the right seat while I took photos out the open left side of the helicopter. Flying above the city, you realize what a tough challenge rebuilding is going to be. Some of the high ground neighborhoods are more or less back to normal, with the exception of blue tarps covering damaged roofs. The low-ground neighborhoods, however, whether formerly rich or poor, are deserted. It looks as though a 1970s-style neutron bomb was detonated leaving the buildings and cars, but killing all the people. No homeowner in one of those neighborhoods is going to be able to rebuild without taking on a tremendous risk. What if the other people in his neighborhood decide not to rebuild? He will have spent $200,000+ on a new house in a dangerous abandoned area.
There are two levels of risk involved here. One is the risk the putative homeowner is willing to put up with, based on the level of abandonment. The other is even harder to get around - the level of risk a mortgage lender is willing to deal with in order to fund rebuilding in what's now recognized as flood plain. Sure, it's always been flood plain - but potential lenders have had that fact put in front of them now. Between those two things, I expect that large parts of New Orleans will simply never get rebuilt, regardless of what kinds of aid packages go into the project.
The other night Chris Pirillo recorded a podcast in his house with a bunch of geeks talking about Xbox 360. It’ll give you some sense of what we’re experiencing tonight. I can’t believe the quality. Sorry, Maryam, we’re going to buy an HD screen in 2006. I’ll go more into debt for one. It’s just so freaking cool.
But, the lines outside retail outlets with Xbox 360 consoles aren't hype, any more than the fascination with the Apple Nano is. I think Microsoft might well have a small tornado on their hands here.
It's an unnatural act in December:
Football that matters in December in the Meadowlands. Even better - looks like Washington wins this week over Dallas. The Giants play the Skins next week, but then they play the hapless Raiders in the last week of the season. All they need to win the division is one win.
With some help from the IRC Channel - I got pointed here - I was able to get Eclipse running. Seems I had to specify the VM path - there's likely something going on with my search path.
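For anyone who hits the same thing, this is the sort of invocation that finally worked - the JDK path here is just an example, so adjust it to wherever your JDK actually landed:

```
eclipse.exe -vm C:\j2sdk1.4.2\bin\javaw.exe
```

Pointing the launcher at a specific `javaw.exe` with `-vm` sidesteps whatever the search path was doing.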
So, there it was - not showing any of the system code. I followed some advice from this thread, and Eclipse promptly crashed with a fairly useless error message. Now, even given my previous post (the chirpy one about Civ 4), it's entirely possible that Civ 4 left my machine in a weird state. On the other hand, VW and BottomFeeder are operating fine, and the sometimes shaky Eudora hasn't crashed.
So my evaluation is still nebulous. Of course, this is Windows, and experience has taught me that "works on my machine and not yours" is common enough. I don't have a new enough Linux box to contemplate running Eclipse (It's a PII 400 - although, I should point out that VW runs just fine there :) ). Anyway, I'm still looking.
Maybe my reboot requirement from the first time I tried the game was something else, because I've played a few times since then without any problems. At first, I disliked the UI (relative to Civ 3), but it's grown on me. I also managed to get in a couple of network games with Michael - timezone issues prevented taking the games to completion, but I was ahead in one, and he had me in the second.
So ultimately, I think I'm now happy with the game.
I think it's clear, based on some of the comments here, that some of the people reading this blog don't get what I was after in that post - and don't get my take on Intellisense, either. Let me start with the Eclipse post. I am honestly curious - I haven't looked at the product in awhile, so I downloaded it. I keep getting what looks like a nonsensical error about a JDK revision that - so far as I can tell - isn't installed on my machine. I'm assuming that I'm supposed to start Eclipse via the 'eclipse.exe' file in the main directory, not via the 'startup' JAR file. Sure, I was snarky in that post - but if you read this blog and you haven't come to expect that, well, I'm not sure you get me yet :)
As to Intellisense - I've seen what the Dolphin guys are doing (which is the same thing a couple of optional add-ons to VW do, for that matter). I've also seen how it works in Eclipse, and in Visual Studio. Personally, I just don't find the feature helpful, but I do know that people differ on this. Here's the thing about Smalltalk though - I stated here that sure, a static language like Java (or C#) will be able to give you more precise information for that kind of feature. What I also said is that, IMHO, the benefit that comes from that is far, far lower than the overall loss of flexibility that those typing schemes also give you.
As other people have said about Smalltalk, it's clay in a developer's hands. In contrast, Java and C# are more like balsa wood. Sure, you can build useful things from balsa wood - but it's a brittle material, and you can easily snap things off. Clay is more malleable. It's not a perfect analogy by any means, but it drives at the point I'm trying to make. I understand the benefits that static typing gives at a tool level. It's just that the cost is, in my opinion, far too high.
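A trivial sketch of the kind of flexibility I mean - in a dynamically typed language, any object that responds to the right message just works, with no shared supertype, interface declaration, or cast required (the class names here are purely illustrative, shown in Python rather than Smalltalk):

```python
# Any object that answers the 'describe' message works -- no common
# superclass or declared interface is needed.
class Clay:
    def describe(self):
        return "malleable"

class Balsa:
    def describe(self):
        return "brittle"

def report(material):
    # The only requirement on 'material' is that it responds to describe().
    return material.describe()

print(report(Clay()))   # malleable
print(report(Balsa()))  # brittle
```

A static type system would demand that `Clay` and `Balsa` share a declared type before `report` could accept both - which is exactly the tradeoff the Intellisense precision is bought with.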
It looks like Sony's DRM mistake may be causing blowback where the labels really, really don't want it - with the artists:
The battle between artists and Sony BMG over the use of Digital Rights Management (DRM) copy protection on audio CDs just got even more interesting as some more artists have decided to act. This time it is My Morning Jacket, who's album "Z" is copy protected. They are doing their very own recall of the CDs and get this... they are burning unrestricted copies of the CDs themselves and sending them out to fans. Oh I wonder how Sony BMG feels about that.
Let the fun and games begin!
Dare has a post up on a cross-protocol, browser based IM client (Meebo), and the likelihood that they'll be acquired. That raised a question in my mind though - and it's not meant to be a nasty question - I'm honestly curious. What do MS, Yahoo, and AOL get out of their IM clients? I mostly use AIM, but that has to do with which network most of the people I deal with are on. I don't use AOL for anything else, and I haven't paid them for anything, ever. At one point I used MSN as a backup for dialup access, but I gave that up eons ago too, as I started traveling less, and hotels started having broadband access.
So honest question - what actual benefit are these guys seeing from their IM networks?
I'm not sure that Orlowski would know what a fact was if it started stalking him. In an all too predictable hit piece on the accuracy of Wikipedia, he has a lot of fun at Jimmy Wales' expense. I guess he didn't see this piece on the overall accuracy of Wikipedia versus Britannica.
The reality is, Wikipedia's issues with accuracy have far more to do with the controversies over topical issues than with anything else. I think I mentioned that awhile back.
Time for my weekly look at the logs. BottomFeeder downloads stayed in their zone, at 355 per day. The breakdown:
Those HP numbers always amaze me - who knew there were so many HP users interested in an aggregator? Off to the HTML blog page report:
| Tool | Percentage of Accesses |
Either my audience switches between IE and Mozilla a lot, or a fair proportion of my audience changes from week to week - there's just not a lot of consistency between the IE and Mozilla numbers. Finally, the RSS tool distribution:
| Tool | Percentage of Accesses |
| Net News Wire | 8.9% |
Still a lot of tool diversity there.
You should compare the way Six Apart dealt with a crisis (yesterday's Typepad outage) with the way most companies deal with one. Unlike, say, Sony (lots of denial and stupidity, finally followed by a grudging admittance of something, but not quite wrongdoing), Six Apart got right out in front of the outage and was extremely transparent about it. Now, I don't want to compare the actual problems - they were nothing alike. I'm talking about the responses.
If you look through the postings around the sphere yesterday, you'll notice that there wasn't much (if any - I didn't see any) outrage, and people seemed to be happy with how Six Apart was treating them. Just have a look here - Anil Dash gave a candid interview on the issue on the day it happened. Not weeks later, like too many companies out there.
I may not use their software (I wrote my own server), but I like the way they do business.
Even though I didn't catch much of the final show, I did think of a way to explain the complex ironies of Howard's humor to those who can't see it through politically corrective glasses: When Howard seems to be making fun of Wendy the Retard, he's having fun with her; and when he seems to be having fun with Daniel Carver the racist, he's making fun of him.
I've never found Stern to be funny. Back when he was married, he was somewhat amusing in a 7th grade bathroom sort of way - mostly from the "his wife lets him do what?" kind of perspective. After the marriage ended, so did that question.
I can hear 7th grade level locker room humor anywhere though, and I don't feel a particular need to seek it out. Another thing - if you have to explain something as simple as what Doc tries to explain above, it's not, in fact, simple. I've watched Stern on TV, and I've listened to him on the radio. It's not at all clear to me that he's anything other than a misanthropic fool, and I don't think I'm particularly humor impaired, or a fan of PC notions. Stern is the classic shock jock, and his shtick has ratcheted up the shock meter over the years, as people have gotten used to one set of shocking humor after another.
I've seen real comedians, and while they don't necessarily eschew cursing, they can be funny in almost any context without having to reach back to 7th grade. George Carlin comes to mind. Or the late Richard Pryor.
The last time I downloaded Eclipse, about a year ago, it was slow, but easy enough to install. I figured I'd have another look, since the Java fans who read this blog keep telling me how wonderful it is.
Well, it may well be wonderful. I have no idea though, because I can't get it to run. I downloaded it from eclipse.org easily enough - but then the app told me that it needed JDK 1.4.1, and all I had was JDK 1.3.1. That's where the real fun started.
First check - follow the links from the Eclipse site to the Sun download page - there's JDK 1.4.2. No 1.4.1 in sight. I figured that ought to be close enough, so I grabbed it. That's when I found the marvelous installer that Sun has. Now, I know that the Cincom Smalltalk installer has not always been perfect - but this one did some basic stuff wrong. No cancel button, and it reported progress for awhile, and then just stopped - leaving me wondering whether it was still running. Finally, it finished - and then it asked me to reboot. What's up with that?
Ok, back from the reboot - and Eclipse still tells me I only have 1.3.1 installed. Look at CLASSPATH in a DOS shell - nope, that points to the new stuff. Installed Programs? Nope - no sign of anything with a 1.3.x version anywhere there. Joy. Off to the registry - nope, nothing there.
In desperation, I go into the directory and try starting the file called "startup" instead of the one called "eclipse". Fascinating - that works. But it opens up without any sign that it sees the JDK, so I expect I'm not seeing it the way I'm supposed to. Clearly, I'm missing something here. I have no idea how to even evaluate this thing.
Looks like reality finally caught up with the hype - the public UDDI registries run by IBM, SAP, and Microsoft are being shut down. Wow, who would have guessed that a system like the CORBA registry - but using XML and HTTP instead of binary mechanisms - would also fail to reach widespread usage? It's almost like we've seen this movie before.
Tim Marman points out that the customer is in control:
My "relationship" with Engadget really illustrates this well. I used to read Gizmodo but switched over when Engadget first offered full-text feeds. Gizmodo has since added full-text feeds, but it was too late: at this point I consider myself a loyal Engadget reader. It's one of the few sites I will read when I don't have my aggregator. Even when I do, I still visit the site a lot to leave comments. I also link to them a lot, and while I may not have Scoble's 18,000 readers, I send some traffic your way - that should at least help make up for the "lost" visits from me, right? ( Oh, and unlike Robert, it's not that I actively refuse to link to sites with partial-text feeds - it's just hard to link when I've already unsubscribed and won't see your content ).
It's really very simple: RSS lets the customer control the conversation. In exchange for that control, we will gladly reward you with our loyalty - and we'll be happy doing it. But if you're going to do this, you can't do it half-assed - don't give us partial feeds!
I'm not as hardline on partial content as Tim or Scoble are, but that's not the point - the point is, customers have a lot more power in the conversation than they used to. Marketing departments are figuring this out very, very slowly.
Just follow the steps outlined by Marketing Sherpa, and you too can have a corporate blog with all the identity of the Borg. I mean, have a look at what they suggest:
Before you get in a room with stakeholders to discuss the blogging issue, you must understand what you hope to gain from blogging -- because the direction of that discussion will depend upon your answer.
This falls under their "Set Goals for Blogging" theory. I started this blog back in 2002. Did I have goals? Heck, at the time, I wasn't really sure what I was doing. I started because I thought it might be cool to build a blog server using the Cincom Smalltalk Web Toolkit - I even made it multi-user at first. That stopped, and I eventually invited a community here.
And setting a direction? Heck, I don't have one - I post on things that interest me, as they come up. I don't have even a vague plan for what I'll post - it's all ad hoc and driven by events. I read a lot of content, and a lot of what I read comes from RSS searches for keywords of interest to me and my area of business. Step one should be "read what other people are saying about you".
I rather suspect that a "goal oriented" blog would get really dull, really fast - just like the political blogs that stay "on message" get really dull. If you want to sound like a corporate version of one of the "talking points" folks that show up on the political talk shows every night, follow their advice. Just don't be surprised when you end up with something that no one but the trolls cares about. Here's the kicker on that:
In other words, the company's communications strategy should be mirrored throughout the company's blogs -- but they can show more of a person's or company's personality.
Here's a tip: If this is what you think, then don't even bother. Your "blogs" will be the same kind of puffery as the press releases you already have.
The only useful thing in the entire piece is the idea about a blogging policy - letting people know what corporate considers out of bounds on an official (i.e., hosted on corporate servers) blog would be useful - that way, no one will be confused about what the boundaries are. But - those boundaries should err on the loose end.
Hat tip to Scoble.
Eleven years and billions of dollars later, Itanium serves instead as a cautionary tale of how complex, long-term development plans can go drastically wrong in a fast-moving industry.
Despite years of marketing and product partnerships, Itanium remains a relative rarity among servers. In the third quarter of this year, 7,845 Itanium servers were sold, according to research by Gartner. That compares with 62,776 machines with Sun Microsystems' UltraSparc, 31,648 with IBM's Power, and 9,147 with HP's PA-RISC.
I'm not convinced that HP will stay with the chip either. Everything I've read about this chip says that Intel allowed a technical idea to run ahead of market research, and they paid the price for it. Read the story - it's a good overview of the problem.
We are going to have a slight hiccup in the winter release - we found that we have to re-package, due to a small installer problem - a "typical" install could lead to confusing results (especially in the non-commercial release). So, we are doing a re-package, and should have that all ready early next week. However, Cincom takes a holiday during the week after Christmas, so that will push the official release out until after the new year.
Cees noticed that MS has finally realized that the graphics engine doesn't belong in the kernel:
Well, well, well. After 15 years, the original NT group is being proven right: Microsoft is moving the graphics out of the kernel again, reversing a decision that lots and lots of people frowned upon when they moved graphics into the kernel between NT 3.51 and NT4. Back then, most people concluded that it was the major reason that NT4 wasn’t just as rock-solid as NT3, and it seems that they’ve finally gotten the message.
I wonder how many kernel hacks ought to be removed now that they aren't trying to speed up Doom anymore...
Update: The new files are uploaded. Just grab the new Mac files.
I found and fixed the Mac issue. I'm going to push new builds for that platform, but in the meantime, here's the fix:
- Start BottomFeeder
- Don't select anything in the tree view
- On the toolbar, select the third icon from the left, and download LibTidy
- Restart BottomFeeder, and it should work fine
My apologies for this - if you would rather download an entire new build, I'm doing that now, and will update this post when I have it.
The Yankees are interested in Garciaparra? Why? Here's what ESPN has to say:
Garciaparra is believed to be deciding among four finalists -- the Yankees, Dodgers, Indians and Astros. Tellem has not been willing to identify those teams but said Garciaparra viewed all of his finalists as "attractive options," and is "weighing them all carefully."
There have been indications over the last 24 hours, however, that the Yankees and Indians have grown less optimistic about their chances of signing Garciaparra. The Indians are believed to be concerned that Garciaparra won't prefer a job as an every-day right fielder. And the Yankees apparently believe that Garciaparra would prefer to play first base in Los Angeles over a job as a first baseman-DH in New York, in part because he owns two homes in southern California.
Good gosh, the Yankees need pitching, not another first baseman with stone hands.
Troy blogged about the Technorati support for RSS tag feeds - it turns out that you can add support for that directly to BottomFeeder via the search building wizard. Under the Search menu, hit "Search Feed Builder". Then, click on the "Define Builder" button. You can fill in that dialog box like this:
Then, just go back to the search wizard itself, select Technorati from the drop down menu, and enter the tag you want to search for. Don't worry about spaces; the tool encodes that stuff for you.
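If you're curious what that encoding actually does, here's a rough illustration - the exact Technorati URL format isn't the point, this just shows how a tag containing a space gets percent-encoded before it goes into the search URL:

```python
from urllib.parse import quote

# A tag with a space in it has to be percent-encoded before being
# embedded in a search URL -- BottomFeeder does the equivalent of
# this for you behind the scenes.
tag = "cincom smalltalk"
encoded = quote(tag)
print(encoded)  # cincom%20smalltalk
```

That's all "don't worry about spaces" means - the tool handles this step so you can type the tag naturally.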
Ebiquity has taken a look at the ping-o-sphere - specifically, at the pings that hit servers like weblogs.com. The results are not terribly surprising - around 3/4ths of all pings are bogus:
In the next step we used our work on splog detection to detect splogs (and hence spings) among the english blogs. Our detection mechanism is close to 90% accurate. As shown in the charts below pings from blogs average around 8K per hour and those from splogs average around 25K. [ed: follow the link for the charts]
Clearly almost 3 out of 4 pings are spings! Going back further to the source of these spings, we observed that more than 50% of claimed blogs pinging weblogs.com are splogs.
Ouch. That means that any of the services that rely on pings are going to end up using (or, are already using) the same kinds of techniques that email clients and servers use to identify spam. Of course, with that, we get false positives (i.e., good messages getting junked).
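The "3 out of 4" figure falls straight out of the hourly averages Ebiquity reported:

```python
# Ebiquity's averages: ~8K pings/hour from real blogs,
# ~25K pings/hour from splogs (spings).
real_pings = 8_000
sping_pings = 25_000

# Share of all pings that are spings.
sping_share = sping_pings / (real_pings + sping_pings)
print(f"{sping_share:.0%}")  # 76%
```

76% of pings being junk - no wonder the ping services are headed toward spam-filter territory.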
So that got me thinking about the piece Doc Searls did awhile back on the net, the carriers, and who charges what. Specifically, I came back to this:
There's nothing wrong with being in the bandwidth business, of course. But some of these big boys want to go farther with it. They don't see themselves as a public utility selling a pure base-level service, such as water or electricity (which is what they are, by the way, in respect to the Net). They see themselves as a source of many additional value-adds, inside the pipes. They see opportunities to sell solutions to industries that rely on the Net--especially their natural partner, the content industry.
They see a problem with freeloaders. On the tall end of the power curve, those 'loaders are AOL, Google, Microsoft, Yahoo and other large sources of the container cargo we call "content". Out on the long tail, the freeloaders are you and me. The big 'loaders have been getting a free ride for too long and are going to need to pay. The Information Highway isn't the freaking interstate. It's a system of private roads that needs to start charging tolls. As for the small 'loaders, it hardly matters that they're a boundless source of invention, innovation, vitality and new business. To the carriers, we're all still just "consumers". And we always will be.
Well, the spammers are also freeloaders. A large part of the problem is the simple fact that - unlike sending physical junk mail - the cost for sending junk email, or setting up splogs (or sending out pings from them) is pretty much zero. Now, I'm not saying that the carriers are "the good guys" - far from it. But one of the complexities that Doc didn't really touch on is why the public - and a fair amount of the technically literate public - might be willing to go along with their vision. Net users are inundated by junk mail, web searches are clogged by bozo results, and real (and reported) virus/worm attacks are perceived to be rampant.
If the carriers get the things Doc is afraid of, these things will be a large part of the reason.
Steve Kelly weighs in on a little brouhaha in the DSM world - his CEO, Juha-Pekka Tolvanen, posted a quote from Grady Booch made at an industry panel. Grady responded, both in comments and on his own blog, that he'd been misquoted.
It might have stayed there, except Steve dug up a number of other people at the event who heard Booch say what Juha said he said. Now, Steve is a straight shooter - he's not one to run wild with an accusation - so I'm inclined to follow him on this one.
Hey look - these clowns at Visto who are suing Microsoft think they invented TCP/IP and various mail protocols:
Visto has been at the forefront of developing mobile communications solutions for nearly ten years. Company co-founder Daniel Méndez and others developed the system to allow consumers to securely receive their email and other sensitive data via mobile phones or other mobile devices while traveling. Méndez and Visto went on to patent the system that drives email from personal or business servers to mobile devices like cell phones and allows users to access sensitive data and email stored behind highly secure corporate firewalls.
So if I have Cincom Smalltalk installed on a "Smart Phone", and I use IMAP to access a mail server, am I infringing their patents? Their CEO, Brian Bogosian apparently thinks so. I guess I'll have to watch which devices I use to collect email with in the future. Sheesh, what a maroon.
TypeLess has been a plugin for BottomFeeder for a long time now - but at some point in the last 3 months, a change in some of the XML code broke the way settings got written out. The way this manifested was in a failure to load settings from disk - so you could run TypeLess once, and then it would fail if you saved settings and tried again. Here's the fix if you have seen that:
- Delete the file Typeless.xml in the BottomFeeder directory
- Grab the Typeless update via the upgrade tool in BottomFeeder
- Open TL, and save settings.
After that, it should all work right. Sorry about that.
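For the curious, the failure mode was a settings round trip - settings got written out to XML in a form the loader couldn't read back on the next run. Here's a rough sketch of what a healthy round trip looks like; this is illustrative Python, not the actual plugin code:

```python
import xml.etree.ElementTree as ET

def save_settings(settings, path):
    # Write each setting out as a <setting name="..."> element.
    root = ET.Element("settings")
    for name, value in settings.items():
        node = ET.SubElement(root, "setting", name=name)
        node.text = value
    ET.ElementTree(root).write(path)

def load_settings(path):
    # The load side has to agree exactly with the save side --
    # a mismatch between the two is the class of bug TypeLess hit.
    root = ET.parse(path).getroot()
    return {node.get("name"): node.text for node in root.findall("setting")}

settings = {"font": "fixed", "autosave": "true"}
save_settings(settings, "typeless-demo.xml")
assert load_settings("typeless-demo.xml") == settings
```

When the writer changes format and the reader doesn't (or vice versa), you get exactly the symptom described: the first run works, and every run after a save fails.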
Looks like Sony has a lot of ground to make up on the PS3 - if this report from Merrill Lynch is correct:
According to a report from Merrill Lynch, published in Japanese magazine Toyo Keizai, Sony is set to lose over US $1 billion on the Playstation 3 in the year following its launch. The report indicates that Sony may be willing to sell the console at less than the production cost - said to be around 54,000 yen (almost US $500) - in an effort to gain a significant share of the market.
That's a huge pile of cash to burn through - they better hope that the negative halo they gained from the DRM mess doesn't bleed over into this space.
Andy Bower (Object-Arts, makers of Dolphin Smalltalk) has some thoughts on being a Smalltalker - here's his summary of Java:
We started building Dolphin in 1995 and, when Java floated in on the Internet bubble later that year, our disappointment was palpable. It wasn't just that Java was getting very big, very fast and was obviously going to make it harder to sell Dolphin to the masses. It was more that, really, the designers had just missed the point. Or at the very least they had missed an opportunity. Yes, they had a virtual machine and garbage collection, which was more than C++ ever did but what about all that other stuff they could have taken from Smalltalk? What about dynamic typing? What about keyword selectors to aid readability? What about proper Reflection? What about "Everything is an object"? Heck, you couldn't even add two Integers together. It was very sad.
And .NET? Recall that Dolphin is a Windows specific Smalltalk, so .NET was potentially a good thing for them. Except:
With the advent of .NET, I must admit we thought it was all going to change. Here was a virtual machine that was designed to run multiple languages using a common "object" model. Okay, it only ran on Windows but, hey, so does Dolphin so we weren't too unhappy. But where was all that good stuff again? Yes it's all marginally better than it was with Java but, really, writing a performant Smalltalk on top of the current CLR is just impractical.
We saw the same thing here at Cincom. We seriously considered hosting ObjectStudio on top of .NET (and in 2003, Microsoft called me a few times to gauge our interest). The problems are as Andy says - it's just not possible to build a performant Smalltalk on .NET. The JVM is even worse that way. There's reason to be cheerful though:
The great thing from our point of view is that, if you like Ruby you are pretty well sure to like Smalltalk so, hopefully, we're about to get a new influx of dynamic, everything is an object, programmers and that can only be a good thing for Dolphin and for computer science. It's taken 25 years but we're nearly back there.
Yep. After a ten year road to nowhere, the industry might be rediscovering actual progress.
Recently I offered my book, God’s Debris, for free on the Internet, under the theory that the people who like it might be inspired to buy the sequel in hard copy. 170,000 people downloaded it in two weeks. Many of them presumably e-mailed it to other people who e-mailed it to yet other people. I’m guessing half a million people read it in the past month. It’s a love-it-or-hate-it kind of book, so let’s say 250,000 people loved it. That seems about right based on the reviews on Amazon.
Adams' idea was to charge for the follow-on book, as a kind of experiment - the results?
I don’t know the exact number, but it appears to be less than a thousand. An alarming number of readers were confused about this whole process and wrote to ask if they could also have the sequel for free.
This is an obvious problem (although it seems to go right past the deep thinkers at Sun, who seem to think that revenues will just flood in if they make everything free). There are a few different reactions to the challenge posed by free competition. On the one hand, you see reactions like the ones from Sun - and I think they'll run into the same issue that Scott Adams did.
The other common reaction is the one pursued by the RIAA - rage against the technology that enables free downloads, and try to be like the little Dutch boy, with fingers in every dike breach. Almost no one takes the approach that Jobs was smart enough to see at Apple - come up with a reasonable price and download system that encourages people to do things legally, without trying to take them to the cleaners.
The Apple approach leads to growth and happy consumers - the RIAA approach leads to stupid stuff like the lawsuits they are using (which alienate customers and prospects), and self-defeating DRM approaches (Sony). Funny, then, that Apple is the odd man out in the music business. I guess inertia is a more powerful force than success. Let's see if Microsoft is paying attention - will they stay with PVP-OVM, and end up stunned by angry users?
I tried to make sense of this post from Gillmor, but it was too many non-sentences jumbled together, with the word "Attention" tossed in a few times to prevent me from nodding off. I may be no fan of Office - and I still think that the Ribbon in Office 12 is an utter atrocity - but if this article is what passes for opposing ideas, then the Redmondites have nothing to worry about.
Lispian explains why the one true language theory of software development doesn't work.
Bob reports that Cincom Smalltalk, Winter 2005 Edition, is ready for release. It's in the Cincom release machinery now; I'll report back with expected shipping dates when I have them. As well, once that happens, the NC download will flip to the latest stuff.
The scientific magazine "Nature" has compared 42 articles from both the encyclopedia Wikipedia and the Encyclopaedia Britannica. Experts in the relevant fields were asked to check them for factual errors. To the surprise of Nature, both encyclopedias contained similar numbers of errors.
Will the relentless Wikipedia critics get a clue, or decide that this isn't worth noticing?
The music industry isn't pleased with what Apple is doing with the iTunes store - BusinessWeek lets them have a soapbox:
Not necessarily. As has been true since the start, iPod owners mostly fill up their players from their own CD collections or swipe tunes from file-sharing sites. Now legal downloads may be losing their luster. According to Nielsen SoundScan, average weekly download sales as of Nov. 27 fell 0.44% vs. the third quarter. Says independent media analyst Richard Greenfield: "We're not seeing the kind of dramatic growth we should given the surge in sales of iPods and other MP3 players."
Which brings us to a grand irony: Apple, which launched the digital music revolution, may now be holding it back. Critics say Apple's proprietary technology and its refusal to offer more ways to buy or to stray from its rigid 99 cents a song model is dampening legal sales of digital tunes. "The villain in the story is the iPod," says Chris Gorog, CEO of Napster Inc. (NAPS ), which sells both subscriptions and downloads. "You have this device consumers love, but they're being restricted from buying anything other than downloads from Apple. People are bored with that."
Umm, yeah - we'd much rather buy from bozo outfits that install rootkits on our machines. People are "bored" with the Apple store? Well, Chris - that sounds like a heck of a business opportunity to me. How about you try *gasp* competing with Apple instead of whining about their business model? Hmm - I decided to take a walk over to the Napster store and have a look around - pricing information on their subscription service is pretty well hidden. I wandered over to the FAQ, and came across this:
What happens to the music I downloaded to my PC if I cancel my Membership?
If you cancel your Membership, the music you downloaded from Napster will no longer be playable at the end of your current billing period. You can still use Napster Light to play and organize all of the music you own without a membership fee. Access Napster Light with same user name and password. With Napster Light, you can also sample 30-second clips and buy songs for 99¢ and albums from $6.95. If you decide to resume your Napster Membership, your Napster music library will be restored and your downloaded music will be playable again.
And they have the gall (later in the page) to call what Apple does lock-in. I can burn CD's off of iTunes to my heart's content. If it's stuff I bought, they don't bring across anything but basic track info, but they don't render my collection worthless either. I'm not sure how this restriction plugs into Napster Light, where you can buy songs one at a time for 99¢. But the main membership - info on which I did find in the FAQ - costs $14.95 per month. Hey Chris - I'm bored with that. Looking over at iTunes, I notice that Apple just charges me 99¢ a song, and doesn't turn my music off if I decide I like another service better.
Which one of these do you think was set up with the help of our *cough* friends *cough* at the RIAA, and which one wasn't?
Travis points out that - when you look at the implementation details - Smalltalk I/O can be as fast as C.
There hasn't been a lot of reporting on New Orleans of late, but a post by Dave Winer got me thinking about the city - it's going to come back, but it won't ever be what it was. Have a look at the history of Galveston, TX, before and after the 1900 storm.
Before the storm, Galveston was an up and coming commercial center, with lots of the nascent oil business going there. Afterwards, that all went to Houston. I expect a similar thing to happen to New Orleans, including the port itself - lots of business that went elsewhere (and found that it could get by elsewhere) simply won't come back. Like Galveston in 1900, the risks will look too high, and - if a full rebuild is necessary - business owners will look to mitigate their risks.
Galveston didn't disappear, of course, and neither will New Orleans. However, Galveston is no longer a commercial center - it's a tourist location. New Orleans is likely looking at the same fate.
Bobby Woolf falls into a common trap - he assumes that most people need massive scaling for their projects:
The article talks about tasks that don't require much business logic. Google just displays search results, e-mail, and map images. Yahoo is the poster child for portals, aggregating existing info and integrating it on the glass. They both use read-only data that can be highly replicated; users can configure the display. Those tasks require minimal programming logic, so PHP scripting and a simple SQL database may be all you need (plus a Web server and OS). Even then, huge sites like Google and Yahoo must be doing much more than just using PHP.
But a lot of sites need more than scripting. Do your users need to: Find airline tickets? Trade stocks? Does your implementation need to: Integrate with EISs? Enforce security? Coordinate multiple users updating the same data concurrently? Good luck with PHP scripts. You're gonna need J2EE or .NET for that. I can tell you that this is what WebSphere (WAS) customers are doing.
The dirty secret of the software industry is that most people are, in fact, building fairly simple applications. Most users of things like WebSphere are using it simply as a JSP container - and that's a pretty complex (and expensive) container. Especially when you could build the same thing in Smalltalk in half the time, and for a fraction of the expense. Not to mention that you wouldn't need the army of consultants that WebSphere seems to require.
He goes on to analogize the current debate to the early 90's Smalltalk vs. PowerBuilder debates, and says this:
So LAMP may well work if you want open-source everything and just want to display (and CRUD?) your database. But for full-blown applications hosted on the Web, LAMP won't cut it. AJAX is a cool display technology (see Ajax and Java), but it's only a display; you still need a server behind it running something (LAMP, Java, .NET, etc.). .NET is on the same level as J2EE, and c# is very Java-like, so then that comparison is the old Microsoft-only vs. semi-open-standards and write once, run everywhere argument.
I'd bet good money that the scaling issues faced by Google, eBay, and Amazon are far beyond anything that most web developers will ever need. Funny that they didn't buy into the J2EE/WebSphere camp then; however did they manage it? According to Bobby, it's because they have simple applications. I'd disagree - I'd say it's because they made a rational choice to avoid the absurd levels of complexity in J2EE.
Hat tip to James Governor
I can't follow the link from Digg, but it sounds like artists are starting to get annoyed with Sony - likely because of damage to their sales. There was a piece on this in the NYT a week or so ago (sadly, the Times has now tossed it behind a pay wall) - if the entire DRM idea gets tarred by Sony's missteps, all the better. In the end, the artists are the ones who take the biggest hit when people stop buying CD's from a specific label.
Sela Ward is a fine actress, so it's painful to say this - she doesn't belong on House. Not her specifically, even - her character. I like this show a lot, but I like it for the interplay between Hugh Laurie and the three younger doctors. Ward's lawyer character throws the balance off, I think. This week's episode doesn't have her - and it runs a lot better.
To start, we launched Blink with a bevy of marketing dollars and a message very much focused on the individual storage benefits. We were very successful at attracting users (at its height Blink had 1.5 million members; del.icio.us currently has 300,000) and getting them to import their bookmarks into our system.
What I find interesting about this pair of posts is the thought that a company with 5 times the user base of del.icio.us could be considered a failure while del.icio.us is not. This makes me wonder what defines success here... That the VCs made a profit? I assume that must have been the case with the del.icio.us sale, while it clearly was not with the original Blink.com service. Perhaps it's that the founders end up as millionaires? Whatever it is, it definitely doesn't seem to be about users.
I tend to agree with Anil Dash, del.icio.us isn't yet a success except for being successful at making the founders and VCs a good return on their investment. If a service can grow to be 5 times as large and still be considered a failure then I think it is safe to say that calling del.icio.us a success is at best premature.
Wow, I remember making fun of Blink "back in the day", but I had no idea they had grabbed so many more users than del.icio.us. I'd call it a success for a simple reason though - they got bought by Yahoo, and Yahoo can fold the service into things that make money. Sobering reading though - for all the hype, I never would have guessed that the user penetration was lower than Blink's.
Peter Yared, CEO of software maker ActiveGrid, spent a critical chapter of his career steeped in Java, the programming language developed by Sun Microsystems. In the late 1990s, Yared was chief technology officer of NetDynamics, which pioneered an application server designed to boost the performance of Web sites. It was based squarely on the then wildly popular Java. He went on to spend five years as an executive at Sun. So it's especially surprising that Yared holds this view: "Java is a dinosaur."
The article talks up LAMP, but also points out that .NET usage is up. The overall upsurge in interest in dynamic languages is a good thing too.
Wired points out that product placement in TV shows went up 84% last year - the impact of TiVo and similar devices (I expect most of the upsurge was from cable provider boxes). It seems that the writers and actors are confused about where the money comes from:
TV networks are turning to product placements to fight back against ad-skipping technologies like TiVo, but now some writers are putting up a fight, demanding more pay in exchange for scripting product plugs into their shows.
The issue sparked open protest last month, with both the Writer's Guild of America and the Screen Actors Guild calling for a "code of conduct" to govern the use of stealth advertising.
Code of Conduct? I fail to see how a product placement is more offensive than "message" storylines. Not to mention that the writers are forgetting where the money comes from. Until the iPod model takes over, it's still coming from advertisers. Like newspapers, they aren't reacting that well to disintermediation.
“The villain in the story is the iPod,” says Chris Gorog, CEO of Napster Inc. (NAPS ), which sells both subscriptions and downloads. “You have this device consumers love, but they’re being restricted from buying anything other than downloads from Apple. People are bored with that.”
Translation: "The public likes the competing product better! It's unfair! I'll try to spin my way out of that problem!"
The article comes from Business Week. They must have spent 3 whole seconds on research.
Cees explains why he gave up on Linux on the desktop:
As I wrote, I switched from Linux on the desktop (after almost 10 years!) back to Windows XP. I just was fed up with having to dig around for device drivers and support software for my camera, my scanner, my game pad, my iPod, my phone, my printer, etcetera. That I couldn’t get a 16 bit workflow done under Linux was the breaker. I left Windows XP on my laptop, installed shareware on it, and never looked back.
One of the regulars on the IRC channel made the same point this morning - it's just too much work. If you use Windows or Mac, things "just work" when you plug them in. Sure, you can often get them to work on Linux (eventually) - but in the meantime, how much time have you spent?
That kind of fiddling just isn't interesting for most people - because most people aren't entertained by trawling Google results for device driver information. Sure, Windows has flaws - more than I can count. But it's a lot closer to being a consumer friendly device than Linux is, or ever will be. The dirty secret is that it takes money to write drivers for the huge variety of peripherals on the market - and while Apple and MS have the resources to do that, the open source community just doesn't. Free development just doesn't support that kind of thing, unless it happens to bite a developer with the right knowledge. That's a thin reed to base your hopes on, and it's the one that Linux on the desktop advocates have been counting on.
On the server? Sure, I much prefer Linux. Like cees, I can see it being useful in a locked down corporate environment as well (although, to be honest, I'd go Mac there first). In the general consumer space? Not happening.
Cedric thinks that refactoring in Smalltalk is simply a toy:
I can't believe that some people actually consider the dynamic refactoring approach used by the Smalltalk IDE as more than just a passing amusement.
I guess the Refactoring Browser - standard equipment in all modern Smalltalks - is a toy then, and all of us Smalltalkers out here are simply engaged in mental masturbation.
Or maybe, we're busy being productive. Cedric's point about testing being required after a dynamic refactor is true, but pointless - if he thinks you don't need to test code in Java (or C#, etc) after refactoring, I feel sorry for the people he delivers to.
Like Cees, I'll admit that static typing allows for more precision in things like auto-completion. I'm also with him on this:
It's not like we can't see any advantage in the dead objects world; it's just that, on the balance, the advantage is to dynamic languages. In my experience, they mesh better with how humans think and work: a bit fuzzy at times, thriving on interaction, thinking fast and switching fast, and needing tools that follow them. I don't think, probably contrary to a lot of static typing adepts, that software is an engineering discipline (some thoughts I wrote up here and here). Tools need to be more like clay than like the machines in a factory assembly line, and a Smalltalk IDE is the 'clay-est' tool I have encountered so far…
I'll take the malleability - and the molding power it brings - instead of the "power" to use Intellisense, with the handcuffs that come with it.
Earlier this year I wrote about Dr. Cem Kaner's meticulous chronicle of his experiences trying to get Alienware to fix, and then ultimately to refund his money for, a lemon of a computer he'd purchased from them. Since then, I've received a rather constant stream of complaints about the company, almost rivaling the gripes generated by the commodity PC vendors like Dell and HP.
What's struck me in particular is how similar many of the reader tales about Alienware are to Kaner's in terms of involving both faulty hardware and unhelpful support. "I would tell my story about my Alienware Area 51M laptop in its entirety, but I would merely be reiterating what happened to Cem Kaner," one reader wrote. "Though I did not have to go through the extremes he did to get his machine functional, I did encounter hardware issues right out of the box. I too experienced delayed delivery issues, hardware issues right out of the box, and failing tech support that knew nothing about PCs or basic networking. In total, the machine was sent back for repair three times. The last time it was in repair for a month and when it was returned to me, the modem was broken, the SD card reader they were supposed to fix was still broken, they had chipped the lid in two places, and the DVD+/-RW has some type of thermal paste that drips into the tray and keeps it from opening properly and potentially ruining media. I travel for a living, this machine was purchased to be my office and for its perceived durability and reliability and excellent tech support. Instead, those of us who purchased an Alienware system have all spent thousands of dollars on machines that often equate to little more than a doorstop or glorified paperweight."
The amazing thing is how many companies still think that they can provide shoddy or non-existent service and not get called on it. It's been over a decade since the web started to become a player in this area, but blogs - and what they've done to make personal publishing easy - have really accelerated the empowerment of consumers.
Back in 1995, you could put up a website, but getting a hosting solution (and getting the HTML to the site itself) was a chore - far beyond the level of crap that most people want to deal with. Now, it's simple - there are multiple free blog hosting services, and a variety of low cost ones. You can vent your spleen on the cheap, and getting the content posted isn't complicated.
Before that, bad service didn't get beyond word of mouth - and the national (or even local) media only covered truly bad service, the kind that killed/hurt people, or ended up scamming them out of large sums of cash. Now, word of mouth extends around the globe - sometimes, with a kick from digg, or slashdot, or drudge - in minutes. Even without those kicks, search engines make the complaints far more visible than many people seem to think (witness the sort of tale Foster is relating).
Radio Silence as a strategy doesn't work anymore. Too many people are willing to point out the problems, and they get amplified quickly if it's a common problem. If your customer service stinks, people are going to find out. Sooner than you think.
Google is making money mostly from ads - looks like Amazon has a different idea: open up the index, and charge based on consumption. They are opening up Alexa as a platform anyone can build on, and you pay as you go. It will be interesting to see who jumps at that:
Anyone can also use Alexa's servers and processing power to mine its index to discover things - perhaps, to outsource the crawl needed to create a vertical search engine, for example. Or maybe to build new kinds of search engines entirely, or ...well, whatever creative folks can dream up. And then, anyone can run that new service on Alexa's (er...Amazon's) platform, should they wish.
It's all done via web services. It's all integrated with Amazon's fabled web services platform. And there's no licensing fees. Just "consumption fees" which, at my first glance, seem pretty reasonable. ("Consumption" meaning consuming processor cycles, or storage, or bandwidth).
A lot of people have been talking about making search better - here's a platform that might allow some of them to try out a few ideas - without the huge expense of building up the server farm. Something to keep an eye on, that's for sure.
Nowadays, there is a lot of talk about memory management and garbage collectors. I personally think that garbage collectors are a bad thing. If you can't keep track of your data to delete it, then how are you going to track it to do something useful? It amazes me that people would say that programmer time would be better spent on more useful tasks. What can be more useful than proper management of your data? This is laziness, plain and simple. It's like anything else in programming - checking error codes, trapping exceptions, using proper syntax, learning new API's, etc. No one wants to do it. But just because you don't want to do it doesn't mean it shouldn't be done.
Lazy? Hardly. GC is simply better for the vast majority of applications. Why is that? As you build a larger application, you'll have a tremendous number of objects flying around in and between modules. The management issue will get to be more and more troublesome, as proper encapsulation techniques will get in the way of the global knowledge necessary to properly manage memory.
So what ends up happening? Developers in large C/C++ projects end up building their own half-baked GC themselves. I say "half-baked", because it almost certainly isn't going to be as efficient as the systems in Smalltalk, Lisp, C#, or Java - those were all built by people with deep knowledge of the field. It's not that application developers are stupid (heck, I'm one!) - it's just that GC is not inside the problem domain they work with. Why Vorlath wants to make it one is a mystery. Here's what he says:
Whenever I hear someone say that garbage collection saves time, I know these are beginners. It's not an insult. You just have to keep practicing. If deleting your data is slowing you down, chances are you need more experience. Now, there's a difference between someone who absolutely needs a garbage collector and someone who organises his data using the stack or automatic pointers. These are two different areas completely. I personally don't even think about deleting code. It's a normal part of programming that doesn't take any more or less time than anything else. And you know why I don't even think about it? Because it's all part of managing your data. If your data is organised, it's not even an issue.
Maybe if you have something close to a photographic memory, you can keep track of whose responsibility it should be (and when) to kill off a given object. Not being able to track such things isn't "laziness" - it's a matter of getting overwhelmed by complexity. I'll make a simple-minded analogy: Chess is a relatively simple game, but - at any given moment - there are tons of possible moves, and the possibilities expand as you also consider the possible moves of your opponent. There are only a handful of masters who can play with (and sometimes beat) software specifically built to track all of those possible moves (and assign valuations to them). Are the rest of us lazy because we simply can't track as well as a master, or the custom software?
In Vorlath's world, I guess so. Where does this sort of thing take him? Well, he gets to that:
There's something else I want to discuss and it's having programs continue to run after their state has been corrupted. I, for one, would rather it come crashing down instantly. That way, I can fix it right away. Having software keep executing with a corrupt state where you can't easily trace the problem is not advancing the way we write software. This is a step backwards. If you absolutely need your software to keep running, have redundant systems.
If you make a mistake in freeing a pointer, you can end up with corruption, and have no idea where said corruption is - so crashing is the best possible result. If you use GC, you can have memory leaks (because you have strong holds on objects that you shouldn't have), but the sort of corruption possible with pointers won't come up - unless we are talking about GC added onto a language with pointers. In a fully managed language, you won't get that.
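To make the contrast concrete, here's a minimal workspace sketch (my own illustration, not from Vorlath's post) of what a "leak" looks like under GC - an unwanted strong reference, rather than heap corruption:

```smalltalk
"A GC 'leak' is just a strong reference you forgot about. The objects
stay reachable through 'cache', so the collector correctly keeps them."
| cache |
cache := OrderedCollection new.
1 to: 1000 do: [:i | cache add: (Array new: 1000)].
"Nothing dangles and nothing is corrupted - memory use just grows.
Dropping the reference makes the arrays garbage, to be reclaimed on
a later collection:"
cache := nil
```

The failure mode is benign compared to a stray free: the heap stays consistent, and a memory profiler can show you exactly who is holding the references.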
What Vorlath really wants is a small priesthood of experts - but even there, he's unrealistic:
There is a frightening trend going on where people graduate college or university and don't know how a computer works, yet they have their CS or Computer Engineering degree. A clear understanding of how memory works, paging and protection are critical. Also critical is a good understanding of the stack and different calling mechanisms. This will explain why certain languages are the way they are, especially C.
The problem with C (the one he's pointing to, anyway) is because of the hardware knowledge and assumptions of the original designers. It was built to be a high level assembler, with the notions learned from the hardware they were familiar with. If I'm writing an RSS aggregator, I don't need to know or care how memory works at the hardware level. I merely need to let people know what kind of minimums the application expects. The stacks and different calling mechanisms? Those can differ across hardware and languages - how specific is this priesthood going to be? This guy's vision of the field is pretty darn blinkered.
We know that the Nintendo Revolution won't be pumping HD - Nintendo has admitted as much. However, here's some interesting speculation from Falafelkid as to what they will be doing - better graphical results using a technique called displacement mapping. Rather than try to quote segments here, I recommend that you head over there and read it. The upshot - games like "Call of Duty" might look almost as nice on the Revolution as they do on the XBox 360.
WonderBranding talks about cultural anthropology as a way of getting market information:
Watch how she not only uses your product but also how she doesn't … is she unaware of an advantage your product offers but you keep forgetting to tell her about? Or is there something you need to change? Procter & Gamble is having great success with an ongoing series of "immersion" studies. They've gotten important feedback from surveys, but it was only when they had a group of mothers wear headgear cameras throughout the day (giving them a bird's eye view of what the mother sees) that they discovered new ways of packaging diapers and baby wipes that made them easier to use. Information like this is the reason P&G profits are on a steady incline.
I wonder though - are the people who are willing to wear headcams truly representative of your target market? And, does the fact that they know that they are going to be watched change their behavior? I'm not sure there's a better way to gather the sort of information WonderBranding wants, but I think that the data gathered this way should have a few caveats attached.
Their email is: firstname.lastname@example.org
Their address is: WikipediaClassAction.org
PO Box 998
Long Beach, NY
I've been playing "Call of Duty" on my GameCube lately, and really enjoying it - but I was in Target last night, and played the XBox 360 demo of the game. It's pure eye candy. The 360 is still a bit pricey, and the hot power system problem is a little worrisome - but boy, oh boy - does it look nice. I expect I'll get one before next year is out.
I give up on libxml for the time being, and think instead of Chris Petrilli’s comment that ruby (and python) performance is “not quite in the league of Smalltalk (or Lisp, likely), which have extremely mature VMs with on-the-fly compilation and optimization”. Is Smalltalk then much faster than python or ruby, or comparable with C, for the task of parsing moderately large XML files?
No. Time to load and parse my iTunes library file, an 11mb Apple plist, on a 1 GHz G4 Powerbook with VisualWorks Non-Commercial 7.3.1: about three minutes.
That didn't seem right - I use the XML code in VW extensively, so I'm pretty familiar with it. I grabbed my iTunes file (only 2.7 MB) and parsed that - took 5.5 seconds. Well, the two caveats are, that's a smaller file, and my hardware isn't his hardware. With that in mind, I went ahead and created a large XML file. I grabbed the default feed file for BottomFeeder, and saved it as an XML feed list instead of as a binary dump - like this:
file := Tools.XMLConfigFileSupport.XMLConfigFile filename: 'g:\vw74\image\feeds.xml'.
file saveObject: RSSFeedManager default subscribedFeedsFolder.
file saveConfiguration
That just dumps the 80 sample feeds into a (pretty verbose) XML format - I ended up with a 13 MB file. That seemed large enough, so I tried the parse on that:
content := 'feeds.xml' asFilename contentsOfEntireFile.
parser := XMLParser new.
parser validate: false.
Time millisecondsToRun: [parser parse: content readStream]
That last line times the execution - it ran in 17.9 seconds. Not a couple of seconds, but not 3 minutes, either. There was some GC going on during that, so I'm sure that things could be improved by simply configuring VW with a larger bite of old space up front - when dealing with large amounts of data, a fair bit of time is going to be chewed up either in allocating more memory, or in GC'ing when we hit the current limits (as per the memory policy in place).
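For what it's worth, here's the sort of harness I'd use to take some of that noise out of the measurement. This is a sketch only - `ObjectMemory garbageCollect` is the message I'd reach for in VisualWorks, but spellings vary across dialects and releases:

```smalltalk
"Force a full collection before timing, so the parse isn't also paying
to reclaim earlier garbage. ObjectMemory garbageCollect is assumed
from VisualWorks; other dialects spell this differently."
| content parser |
content := 'feeds.xml' asFilename contentsOfEntireFile.
ObjectMemory garbageCollect.
parser := XMLParser new.
parser validate: false.
Transcript show: (Time millisecondsToRun:
	[parser parse: content readStream]) printString; show: ' ms'; cr
```

Even with a quiet heap at the start, allocation during the parse itself will still trigger collections if old space is sized too small for a 13 MB document.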
For this kind of parse to take 3 minutes, either the hardware would have to be very slow, or memory limits would have to be set badly for dealing with larger files. I'm not entirely sure what was going on.
Update: I ran the same code on my Mac Mini - it has a 1.3 Ghz G4 processor, and a paltry 256MB of RAM. The 2.7 MB file parsed in 12.8 seconds, the 13 MB file in 44.7 seconds. Not speedy, but not the 3 minutes reported by Alan Little either - and the Mini is no high end Mac.
I had a look at the Great Computer Language Shootout site this morning, since there's some VW Smalltalk code (and results) posted there. The comparison I was drawn to happened to be with Mono-based C# (based on a comment here). There are some issues with the comparison, however:
- Have a look at this test - scroll down, and look at the execution. The source code is filed in, and then the test is executed, which slows things down. Update: apparently, the code is not filed in first on the site test.
- Have a look at the C# version - the code is compiled, and then executed.
So I did the same thing I did with Troy's post over the weekend - I downloaded the code and did some local shaking out. To get it loaded, I first had to create a namespace called ComputerLanguageShootout, and then a class named Benchmarks. Once that was done, I could file in the code.
Then, I tried this:
"File in code, then execute"
Time millisecondsToRun: [Smalltalk.ComputerLanguageShootout.Benchmarks nbody: 1000000].
That ran in 11,818 ms - call it 12 seconds. I repeated the process a few times to make sure those numbers weren't outliers; they weren't. The original post here had a bunch of comments about filing in first, but I was wrong about that.
So, on to the profiler. Running it, it turns out that the bulk of the time is spent in the Body>>and:velocityAfter: method. Looking at that, we see the following code:
and: aBody velocityAfter: dt
	| dx dy dz distance mag |
	dx := x - aBody x.
	dy := y - aBody y.
	dz := z - aBody z.
	distance := ((dx * dx) + (dy * dy) + (dz * dz)) sqrt.
	mag := dt / (distance * distance * distance).
	self decreaseVelocity: dx y: dy z: dz m: aBody mass * mag.
	aBody increaseVelocity: dx y: dy z: dz m: mass * mag
It's no big surprise that this test (or any other test that's heavy on arithmetic) is going to be faster in C# (or C, or Java) than in Smalltalk. Why? Smalltalk isn't that fast at math, floating-point math in particular. So if you intend to code up some low-level mathematical model, the calculations should be in a different language - and our customers tend to do exactly that: model in Smalltalk, low-level math in C or C++. What this really points out is that micro-benchmarks aren't that useful. Most applications push data around, typically involving a database, and math operations don't dominate those kinds of applications. In that case, something that lets you code faster is going to help more. As usual, pick the best tool for the job at hand.
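To see exactly what the benchmark exercises, here's that method translated roughly line for line into Python - the Body class is my own stand-in for the Smalltalk class, not the shootout's actual code. It's nothing but float subtraction, multiplication, a square root, and a division, run in a tight loop a million times:

```python
import math

class Body:
    """Minimal stand-in for the Body class in the shootout's nbody code."""
    def __init__(self, x, y, z, mass):
        self.x, self.y, self.z = x, y, z
        self.vx = self.vy = self.vz = 0.0
        self.mass = mass

    def velocity_after(self, other, dt):
        # Rough translation of Body>>and:velocityAfter:
        dx = self.x - other.x
        dy = self.y - other.y
        dz = self.z - other.z
        distance = math.sqrt(dx * dx + dy * dy + dz * dz)
        mag = dt / (distance * distance * distance)
        # decreaseVelocity:y:z:m: -- scaled by the other body's mass
        self.vx -= dx * other.mass * mag
        self.vy -= dy * other.mass * mag
        self.vz -= dz * other.mass * mag
        # increaseVelocity:y:z:m: -- scaled by this body's mass
        other.vx += dx * self.mass * mag
        other.vy += dy * self.mass * mag
        other.vz += dz * self.mass * mag
```

Run a million times, a loop like this measures the VM's floating-point and dispatch overhead and very little else - which is the profiler's point.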
On the Smalltalk IRC channel, it was pointed out that a lot of the other language tests vary just as widely. Ultimately, the shootout site simply isn't doing great cross-language tests.
I found a couple of great gag gifts - they would work really well for kids, too. The Potato Gun:
The Original Harmless Squeeze powered Potato Gun. Just press the tip of the gun into a raw potato, break off a small pellet, aim, squeeze the handle and it will shoot the harmless potato pellet far across the room. You can get hundreds of shots from a single potato. Potato Gun size is 6 inches and simple to use. One Potato Gun per Package.
And the Fart Pen:
It's a real pen shaped like a finger and when you pull on it, out will come farting sounds. Great for that boring night of homework or maybe make your friends laugh at this very comical pen.
The release ball is rolling here at Cincom - barring some unforeseen issue, the winter release should go live before Christmas puts everything into slow motion. Stay tuned - I'll make an announcement when it's all gold.
File under "no good deed goes unpunished" - some lawyers who smell money want to shut Wikipedia down.
Update: If you send email using the link on that page (the one that asks for information, or registration of complaints) and register a complaint (like, say, that this suit is a bad thing), you'll be told that your feedback wasn't asked for. Welcome to the echo chamber!
I found this to be interesting - take the various biometric security measures that are being installed, and the supposed ways that hi-tech criminals/spies get around them:
Eyeballs, a severed hand, or fingers carried in ziplock bags. Back alley eye replacement surgery. These are scenarios used in recent blockbuster movies like Steven Spielberg's "Minority Report" and "Tomorrow Never Dies" to illustrate how unsavory characters in high-tech worlds beat sophisticated security and identification systems.
However, it may take nothing more than a very low-tech spoofing attack - Play-Doh, anyone?
Fingerprint scanning devices often use basic technology, such as an optical camera that takes pictures of fingerprints which are then "read" by a computer. In order to assess how vulnerable the scanners are to spoofing, Schuckers and her research team made casts from live fingers using dental materials and used Play-Doh to create molds. They also assembled a collection of cadaver fingers.
In the laboratory, the researchers then systematically tested more than 60 of the faked samples. The results were a 90 percent false verification rate.
I guess that's why you need the boring human security guard - they can look for those kinds of scams.
ARmadgeddon has some interesting points about the data that analyst firms (Gartner is the main one talked about, but the point is universal) base their decisions on. The bottom line: it's a thin reed. Plenty of good takeaways, but here's one really worth keeping in mind:
Another issue is that data points are limited to clients of the analyst firms. Gartner’s CEO has stated that Gartner has only 15% of the possible end user market at companies. Does Gartner’s or Forrester’s client base represent a statistically valid sample of the overall IT buyer market?
The client base is also self-selecting, which creates problems of its own. So the question you want to ask yourself is this: are the answers you get back worth what you pay for them?