More companies join DataPortability

January 10th, 2008

Two days ago, Google and Facebook shook the industry when they decided to join the DataPortability Workgroup, as enthusiastically reported by Read/Write Web. Today, with the same enthusiasm–and not without a certain sense of disbelief–Read/Write Web is reporting that three more big players have joined the initiative: Flickr, SixApart and LinkedIn. As with Google and Facebook, their representatives will not be mere employees but people with a history of involvement with the issues championed by DataPortability.

It’s quite obvious that Google and Facebook’s move was what prompted those three companies to join the group. Likewise, we can certainly expect more companies to follow in the coming days and months. So what began as an idea to provide guidance to the industry may become a real force for the implementation of portability standards in the coming years.

The most important thing about all of this is that the standards supported by the DataPortability Workgroup are open and, together, represent a natural deterrent against the kind of attitude we often see from Microsoft: embracing standards and later changing them just enough to make them incompatible with other implementations, all to keep its dominant position in the industry.

This year seems more and more promising for open standards. OpenID is being discussed and implemented by lots of applications, and much more is happening each day. The next couple of months may eventually become a landmark in the history of the Internet. I certainly hope so.

A few notes on Twitter

January 9th, 2008

Twitter:
[1] talk in a light, high-pitched voice
[2] idle or ignorant talk

The Oxford Pocket Dictionary, 2007 edition

In light of those definitions, one could wonder why Twitter chose such a name. Then again, maybe the joke is on us.

I’ve been using Twitter intermittently for a few months now. I started using it at a BarCamp for which I provided live coverage for my Portuguese blog readers, and decided to experiment with both formats simultaneously: I would write a more elaborate entry after each discussion, and would try to post tidbits of the conversation on Twitter while people were talking.

The experience ultimately left me dissatisfied with Twitter. Maybe I’m not a multi-tasking person, but trying to post to Twitter while listening to people talk, and also trying to keep up with replies to my Twitter entries, proved too distracting for me.

After a couple of months of usage, I can’t say I have any special insights about Twitter. It seems to be IRC done socially. IRC has long been a popular application among a certain kind of Internet user, but it depends on a very specific application and a deliberate choice of which channels to follow. Twitter changes the equation by allowing a user to subscribe to people instead of channels. Obviously, it lacks the focus of a dedicated IRC channel, although it provides a way for its users to reach across followers and track specific subjects.

This mechanism can be used efficiently by people trying to keep up with meetings and conferences, although some users will be uncomfortable with the flood of information tracking can unleash. But considering that neither IRC nor IM can provide such immediacy across people not linked by personal contacts, Twitter has a distinct advantage here.

The ability to follow people and occasionally track specific channels may prove to be Twitter’s only advantage for me. Using it, I’m able to keep updated on the notional Zeitgeist of the people I’m following–and, indirectly, of the people they are in turn following. Also, if the people I follow belong to my market, I may be able to glimpse trends by seeing what is catching their attention during the day. Of course, this can be quite misleading, and people seeking insight may not find what they are looking for.

Twitter is a noisy tool. Keeping it continuously on is a sure way to lower productivity–more so than IM, because most people still respect status messages, while Twitter grants implicit permission to call for your attention: if you are following somebody, it’s quite obvious you want to see what he or she is posting. This stream of consciousness can be very distracting.

Ultimately, Twitter has outsourced office talk, and the same restrictions apply. You may think people are not paying attention, only to find your boss listening over your shoulder.

I will probably keep using Twitter by applying the same logic I apply to IM. I follow status conventions, and I rarely allow people to bother me when I’m signaling I’m busy. With Twitter, this means turning it off whenever I need to focus on a problem. What Twitter itself may gain from my participation remains to be seen.

Facebook and Google join the DataPortability Workgroup

January 8th, 2008

Read/Write Web is reporting–with great enthusiasm, I may add–that both Google and Facebook have agreed to join the DataPortability Workgroup. I have not found any evidence of the news on the DataPortability site itself, but coming from Read/Write Web there is no chance the news is not true. It’s interesting to realize the announcement came just a few days after the controversy around the removal of Scoble’s account from Facebook.

The DataPortability group is an initiative working to promote the reuse and transparent transference of user data across Web applications. This is a huge challenge, but it’s good to see some smart people working on the multiple issues involved. The presence of two of the most social companies in existence today will certainly give the project a new measure of legitimacy–especially considering the people who will represent those two companies in the group.

One of the early documents created by the group shows that one of its primary objectives is to use existing technology to leverage the transformations needed to reach its goal. This is a good strategy, since it improves the chances that the recommendations made by the group will be followed and that they will be easy to implement.

With Google and Facebook joining DataPortability, the eventual creation of a portability API may represent a turning point for Web applications, as it will empower users and create an entirely new industry around the possibilities of managing such data. Of course, the entire problem revolves around issues of privacy, and that will be a tough nut to crack. Hopefully, with Google and Facebook–and the other companies that will surely follow their lead–working on the problem, an interesting and useful solution may arrive soon.

As the Read/Write Web piece says, this may turn out to be a magical time for the development of the Web. That’s what we are hoping for, anyway.

Series and we, the morons

January 7th, 2008

I think–no, strike that, I’m sure some writers think we are complete morons.

I was watching an episode of CSI: Miami today that showed, in the opening teaser, the following piece of dialogue between Horatio Caine–the series’ main character–and his medical examiner. A woman was dead, and the medical examiner told Caine that she had found two kinds of wounds: one caused by a serrated knife, which was the one that had actually killed the woman, and another, caused by a plain knife and inflicted post-mortem in the shape of the letter Y. Caine–made famous by his cheesy one-liners–proceeds to utter the following statement:

“That would mean the second wound is a serial killer signature”

No freaking kidding, Sherlock!

As if the fact that I’m somewhat educated, that I’m watching a criminal drama and am therefore used to this kind of theme, were not sufficient for me to think the exact same thing a hundredth of a second after hearing what the medical examiner had said.

I also remember an episode of House in which he describes to his underlings how the immune system works. If he has to explain that to a neurologist, an immunologist (!), and a specialist in intensive care, he really needs different people working with him. Maybe that’s why they all left him at the end of the third season. :-)

Anyway, sometimes you do need to explain certain terms and motives to the people watching a series; sometimes, what the specialist is trying to say is too technical, and some people will not understand it otherwise. But you don’t need to insult the intelligence of the other people who do understand what is being said. If House were Grey’s Anatomy, I would gladly let it pass. But House watchers deserve better.

In the CSI: Miami episode, it would just be a matter of waiting until the story developed enough to show the background of the killer, something they would have to do anyway. In the House episode, the explanation could easily have been given to another person–the patient himself or his family.

And to think that some TV executives will try to explain poor ratings by saying that the public didn’t receive the show well. Of course, I’d say: with such writing, people will deliberately not watch.

RIA in 2008

January 6th, 2008

Tim Bray began his predictions for 2008 by saying that this is the decisive year for RIA applications: either they become mainstream or they will be relegated to the dust bin of history. Given the news that Microsoft is planning to overhaul its entire site to showcase its RIA platform, Tim Bray is probably right in saying this will be an important year for RIA technologies.

Bray makes an interesting point when he says that he tends to associate “richness” not with interface–which, he also says, is something only developers care about–but with the interactive capabilities of applications, regardless of the technology they use. I agree, but I also think that Silverlight and Flex (and similar technologies) may have a useful role in a different place, providing different levels of interface in a very specific class of applications: internal sites.

Obviously, Microsoft and Adobe are setting their sights much higher than that–the former with its pathological need to control the industry; the latter with its duplicity about open sourcing its products. I’m not worried. Public-facing applications have different interaction and accessibility needs, and no developer is going to use technologies that will actively harm their applications in those two areas. One of the main problems plaguing alternative interfaces is that they are always trying to catch up with what users have grown used to, and they can never succeed. Between dealing with the cognitive dissonance they force users to experience and dealing with multiple hosting systems, they don’t have the leverage to compete with the advances being made in JavaScript integration.

Another interesting point Tim Bray makes is that most applications are Web-enabled to some extent–even if users don’t realize it. Add that to the growing research in offline/online integration and we are dealing with an entirely different playing field.

Contrary to Bray, I will risk a definitive prediction: RIA, with regard to Flex and Silverlight, will indeed be recognized as a secondary option this year, and no big applications–Microsoft’s site notwithstanding–will be launched using either technology. Conversely, we will see people using Silverlight or Flex in internal applications.

The rest of the year, however, will belong to Ajax.

Motivation and testing

January 5th, 2008

I guess I can safely say that most programmers consider testing an essential part of the software development process–even those who are not using any formal framework right now beyond following a prepared script about what should be tested and how.

Ironically, the parallel ascension of Web applications as the preferred form of modern user interfaces and of agile methodologies as a more efficient alternative to the usual development practices offered a unique opportunity for experimentation in the testing arena. Web applications are usually easier to test because you can automate most of the testing: since they are not event-driven but based on linear request/response protocols, testing them can be done with less cost and more productivity. Likewise, agile methodologies brought to the playing field a need for experimentation to create more competitive practices, which generated hundreds of new tools with a very pronounced effect on testing.
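
To make that concrete, here is a minimal sketch of what such an automated test might look like in .NET with NUnit. The URL, the page contents, and the class names are all hypothetical; this is an illustration of the request/response style of testing, not a prescription:

    using System.IO;
    using System.Net;
    using NUnit.Framework;

    [TestFixture]
    public class HomePageTests
    {
        [Test]
        public void HomePageRespondsWithSuccess()
        {
            // The whole interaction is a single request/response pair:
            // no event loop or GUI driver is needed. The URL is hypothetical.
            HttpWebRequest request =
                (HttpWebRequest)WebRequest.Create("http://localhost:8080/");

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
                StringAssert.Contains("<title>", reader.ReadToEnd());
            }
        }
    }

Because the protocol is linear, a test like this can run unattended as part of a build, which is exactly what makes Web applications cheap to cover with automation.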

The end result is an increased awareness by developers of the testing process. Automated tests are becoming a premise of modern development techniques instead of an optional step in the development process. The benefits are clear: better management of changes in requirements, more robust products, improved integrations, and even better documentation, depending on the tools a developer is using.

Even though those benefits are always touted as the main gains from testing, there is an additional benefit that is almost always overlooked when people talk about the subject: the motivational gains testing can bring to day-to-day coding.

Most new projects have complicated beginnings, with choices made on the spur of the moment that will heavily influence their life cycle. The motivational benefits of tests at the beginning of such projects can contribute to their development in two different ways: first, by making the project’s quality level visible from the first second; second, by the pure pleasure a passing test suite can bring to a developer.

People can be strongly influenced by what they see, and a passing test suite shows that the work being done is not random but follows a precise structure that developers will then strive to keep.

Even legacy projects can use that to their advantage. By incrementally creating a testing process, developers will feel they are gaining control of an otherwise unyielding mass of code, and that will be converted into other benefits as well, with better understanding of the code and progressive knowledge diffusion being two of the most important ones.

To underestimate the effect this kind of motivation can have on developers is to underestimate the human factor itself. Testing provides exactly the characteristics needed to increase motivation while also providing tangible technical benefits. And although the human factor is rarely factored into the choice of a methodology, the past few years have shown an increased awareness of this subject that is quite heartening.

So, the next time somebody complains that testing is a waste of time, maybe you don’t need to point out only the technical benefits–the human benefits can be a strong selling point as well.

Freakonomics

December 23rd, 2007

I’m probably the last person on Earth to read Freakonomics, given the almost cult-like status it has enjoyed since it was published in early 2005. Even though it’s a quick read, I decided to wait for a time when I would be able to dedicate more thought to it than just breezing through its pages.

The book, written in a partnership between Steven D. Levitt, an economist, and Stephen J. Dubner, a journalist, reflects, almost in its entirety, the theories created by Levitt, which are then described in a simpler way by Dubner. Much like The Tipping Point, Malcolm Gladwell’s similarly themed book, Freakonomics quickly grew in notoriety among the intelligentsia because of its allegedly non-conventional explanations for phenomena that had long intrigued the scientific community and defied its ability to explain them. Examples of those phenomena include the reasons for corruption among white-collar workers and why crime declined in the US beginning in the 90s.

Personally, although I enjoyed the book, I found Levitt and Dubner’s treatment of its themes too superficial, lacking the more intriguing analysis provided, for example, by Gladwell in his two most famous books.

At the core of Freakonomics are these four ideas:

  • Experts will use their knowledge for their own benefit;
  • Incentives are the basis of the modern economy;
  • Conventional wisdom is often wrong; and
  • Small events can have profound consequences.

Of the four ideas expressed above, none is particularly revealing or new. In fact, if one has spent more than a couple of minutes reflecting on how the world really works, those four ideas are self-evident. Spending an entire book trying to demonstrate them is a pointless exercise.

The book’s one winning point is the cases used by the authors to demonstrate the aforementioned ideas. Using curious comparisons–for example, what do schoolteachers and sumo wrestlers have in common?–one or more specific real-world examples are presented for each of the points above, showing how, in many cases, conventional wisdom is wrong.

Granted, we, as a species, tend to cling to the explanations we hear most often. But when subjects like those presented by Levitt and Dubner are involved, we rarely do that. Weaving both kinds of examples together is just a way to make the less interesting cases seem like more than what they really are: myths.

Freakonomics lends itself very well to controversy, as the authors are quick to point out. Not without a certain amount of irony, Malcolm Gladwell wrote a glowing blurb for Freakonomics even though the reasons he presents in The Tipping Point for the decline of organized crime in America are directly at odds with–and are even ridiculed by–Freakonomics. A lot of discussion has ensued about this particular point on the Web, including a quick exchange between the authors involved (1, 2, 3), which, unsurprisingly, ended with both parties agreeing to disagree.

Although I’m still able to recommend Freakonomics as an interesting read, based on the intellectual stimulation one can derive from analyzing what Levitt and Dubner are saying, I must confess myself disappointed by the book, especially after the glowing reviews it earned. The Tipping Point, at least the way I see it, provides a much more interesting read, since it attempts to create a framework to explain the changes it describes. Freakonomics could certainly have used a lot more content and analysis. At the moment, however, I can’t say I find what Levitt and Dubner wrote so surprising.

Castle: MonoRail + ActiveRecord

December 23rd, 2007

In the past couple of .NET projects my company developed, since we met with no objections from our clients, we decided to use Castle (by way of its two sub-projects, MonoRail and ActiveRecord) to see how well it would perform. Unsurprisingly, considering the care that went into the Castle code, it made .NET development altogether more bearable.

Castle is a collection of projects that includes database access layers (using NHibernate to power an ActiveRecord implementation), templating engines (of which NVelocity and Brail are but two examples), and a series of other services geared toward rapid application development.

Although my experience with Castle is still small, I’m liking it. I always considered C# a good programming language, and many of its characteristics fit very nicely with the way I like to develop when using an ORM implementation. For example, the way Castle implements ActiveRecord is, at least in my opinion, a much better way to see what’s going on–a nice blend, indeed, of the Rails and Django approaches.
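
As an illustration, here is a minimal sketch of the declaration style used by Castle’s ActiveRecord. The Post class and its columns are hypothetical; the point is that the attribute-based mappings keep the data model visible right on the class:

    using Castle.ActiveRecord;

    // Hypothetical entity mapped to a "posts" table. The attributes
    // declare the mapping directly on the class, so the schema is
    // visible alongside the code that uses it.
    [ActiveRecord("posts")]
    public class Post : ActiveRecordBase<Post>
    {
        private int id;
        private string title;

        [PrimaryKey]
        public int Id
        {
            get { return id; }
            set { id = value; }
        }

        [Property]
        public string Title
        {
            get { return title; }
            set { title = value; }
        }
    }

With the class declared, persistence reads much like its Rails counterpart: post.Save() stores an instance, and Post.FindAll() queries the table.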

Obviously, since C# is not a dynamic language, some things are much harder–or at least, much less flexible–than their Rails or Django counterparts. Castle is also lacking some accessories we’ve grown to love in Rails; to wit, the console and the database shell. Nonetheless, it also shines when debugging is necessary, since Rails lacks a decent debugger (although NetBeans, if you are inclined to use it, solves the problem nicely) and Django is also missing debugging tools.

Looking at the changes already present in C# 3.0, I can see Castle becoming even more pleasant. At the moment, it is already saving us a lot of work and I’m sure it will be a lot better in the near future.

Turbo

December 6th, 2007

The first programming tool I ever used was Turbo Pascal 5.0, in 1994. A 5 1/4" disk, passed around by a professor, was the gateway to a world that had interested me since my first readings about computers and their capacity to be told what to do. From release 5.0, I quickly jumped to 5.5, which offered rudimentary OOP support, and soon was using 6.0, which allowed programmers to use much more interesting OOP features and had excellent graphics support. I started programming my own graphical window manager, until I realized it would be too hard to compete with Windows.

My interest in Borland products didn’t dwindle quickly, though. After a brief fling with Turbo C++ 3.0, I went on to program in Delphi from 1997 to 2003, with sporadic use until 2006. When the company I worked for changed its entire product platform to .NET, I had no choice but to follow. Borland’s frequent strategic mistakes didn’t help either. Soon, one of my favorite tools was just a memory. I still have a copy of Borland Delphi 6, which I purchased with my own money, but the CD has probably stopped working by now.

After so much time away from the community–I used to be very active in the Borland newsgroups, especially those related to Web programming–I was surprised to hear that Borland has restored and modernized its Turbo line of tools. There are now both Turbo Delphi and Turbo C++, new versions of the old Turbo Pascal and Turbo C++. For those into Microsoft tools, there are also Turbo Delphi for .NET and Turbo C#.

Obviously, those are basic versions, stripped of any professional or enterprise features. Nonetheless, it’s nice to see Borland returning to its roots, even though those tools won’t sell enough to justify their existence. Then again, who knows: names can be powerful. Since there is a free version of Turbo Delphi, I guess I’ll be programming in Delphi again soon.

Better than Turbo Delphi would be a new version of Turbo Pascal. I still have a disk around with lots of interesting programs to run.

PDF::HTMLDoc, release 0.1.0

March 5th, 2007

The project has been approved at RubyForge, so it can be downloaded from there now or installed via the usual gem install htmldoc command. The documentation is accessible as well, and there is a Subversion repository for those interested in downloading the files directly.

As I said in the previous entry, I hope this library will be as useful to others as it was to me.