Online Primary

January 30th, 2008 § 3 comments § permalink

Via Read/Write Web, an interesting site that simulates the US primary: Online Primary. According to the article, the site is an experiment in online elections–something I believe we are very far from being able to do, not because the technology is not here but because we can’t trust the established providers.

Very few people have “voted” using the site so far, but it’s interesting to see that Democrats seem more likely to engage in such experiments, and also interesting to note that Barack Obama has almost no votes in the “Anybody But” category.

With the Democratic nomination being more about people rejecting a candidate than about people choosing one, it will be fascinating to follow the site and watch the nascent trends.

Living in Code

January 27th, 2008 § 0 comments § permalink

God said, “Cancel Program GENESIS.” The universe ceased to exist.

Arthur C. Clarke

The Universe as a virtual machine or a simulation is a very old idea. Even outside of science, many cultures have thought of material existence as the dream of a god.

Christianity has always dealt introspectively with questions about the relationship between God and the Universe. For example, if only God existed before time and space, where is the Universe in relation to God? Did the Divinity have to limit Himself when He created the Universe?

More recently, with the rise of favorable conditions, many people have started to devote more time to this kind of exploration–which is a very natural curiosity I may add, considering we all want to know how and why we are here, in this particular time and state.

So it’s no surprise that a recent article, The Physical World as a Virtual Reality, attempts to frame the virtual reality question in light of modern physics. The article is the result of a scientist’s exploration of the implications of a virtual Universe within our current physics framework.

It’s fascinating reading, although no conclusion is given, and no attempt is made to create any mathematical models around the questions presented–something I doubt is possible now, and which, in fact, may never be possible. Of course, if we were able to prove that the Universe is a simulation, the implications would be civilization-changing (Simulacron-3 / The 13th Floor are very good fictional explorations of those themes).

But more interesting than that would be attempts to hack the code of the Universe, changing and introducing new laws. One could imagine an infinite series of Universes, each running their own giant simulations and experiments.

Of course, that raises the ultimate question: if we are in a simulation, what form do blue screens of death take?


Kenna

January 22nd, 2008 § 0 comments § permalink

In Blink, Malcolm Gladwell uses the musician Kenna as an example of good music that is not marketable–because marketing people usually can’t recognize it as good–but that knowledgeable music lovers will love. According to Gladwell, this is an example where just sampling something will not yield accurate results, even when using intuitive expertise.

Regardless of Gladwell’s conclusions, I decided to try and hear a bit of Kenna’s music–and I was floored. Kenna’s music is unclassifiable. It’s a powerful mix of many styles, so well matched that one can’t help but listen endlessly to the variations just one song can provide.

I’ve been listening to the songs of Make Sure They See My Hands, and the variation is unbelievable. Daylight, the album’s first song, for example, opens with a very New Age intro, evolves into a mix of soul and electronica, and finally becomes an operatic rock song. Be Still, on the other hand, has the melodic roots of a traditional rock song but also includes a soft blend of synth pop that makes it unforgettable.

All the other songs share this kind of diversity, using a mix of hip hop, house, synth pop, electronica, soul, rock, and many other styles that will probably satisfy even the most demanding music lover. The U2 influence is evident in many songs (Sun Red, Sky Blue could have been a U2 song), and I liked that aspect of Kenna’s music as well.

In short, a very worthy addition to my music library.

Needed: a new paradigm for Web development

January 18th, 2008 § 0 comments § permalink

In the past few days I have been thinking about the future of development–especially about the growing interest in tests and domain-specific languages, and about the new trends in Web development. I was surprised to realize that, despite all the talk about how those concepts may revolutionize the field, no significant application or framework is combining them into something truly new.

The history of the field is abysmal. We are now forty years into a period of very few changes in the conceptual basis of software development. For twenty years we have been using basically the same tools, practicing the same moves, and not moving at all. The industry remains bound to the minefield of object-oriented programming, relational databases, and bottom-up design.

With regards to Web development, for example, although innovative in many ways, Rails and Django share two flaws that will make them obsolete as quickly as the other many frameworks that have graced the field in the last decade.

The first flaw is conceptual fragmentation. In an attempt to make Web development “painless”, those two frameworks and their descendants have diluted the way the application domain is represented in the application. It’s a more manageable–dumbed-down, if you will–way to develop applications, but the disconnection between code and domain is fairly evident.

The second flaw is a fixation on opinionated solutions. The use of REST by Rails is a good example of this kind of fixation. REST is a very useful concept, even necessary for some applications, but Rails’ half-baked external solution, full of accessory tricks, is sub-optimal. Yet Rails developers stick to it without questioning what it represents for their applications, because Rails is opinionated software.

In fact, many of those so-called modern frameworks are just pretending that complexity does not exist, or that it can be easily managed by the use of small methods and a couple of tests.

Test-driven development is now being treated as a silver bullet. New developers are using it as a panacea–as a way to guide design, as if it were possible to analyze the problem domain of an application by peering through the small window offered by TDD. The enormous number of examples showing how to test what has already been tested is quite insane.
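As a hypothetical illustration of that kind of redundant testing (the Account class below is mine, invented for the sketch, not taken from any framework), consider a test that merely re-verifies what the language itself already guarantees:

```python
import unittest

class Account:
    """A hypothetical, trivial model class -- an assumption for this sketch."""
    def __init__(self, balance=0):
        self.balance = balance

class TestAccount(unittest.TestCase):
    def test_balance_is_stored(self):
        # This only re-tests attribute assignment, behavior Python itself
        # guarantees. It passes, yet says nothing about the problem domain.
        self.assertEqual(Account(100).balance, 100)

# The suite runs green, but no domain knowledge has been verified.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestAccount)
)
```

A green bar from tests like this gives the small window its comforting glow without ever looking at the domain behind it.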

Seaside, which I tend to defend as a next step in Web development because of its innovative application of old concepts and its refusal to submit to the common solutions, is not the full solution either. It’s great, it’s necessary, but it is still a step below what we really need.

Hopefully, the interest in concepts like language-oriented programming will invite attempts to solve the Web development problem in new ways that will transform the field until the next revolution is needed.

Maybe we need a way to generate executable specifications that are really a way to build applications, and not an inferior way to model the expected behavior of an application. Maybe that can be a New Year’s resolution: to think of a way to connect the dots, to join the loose threads created in the past twenty years. Is anybody up to it?

Ted Nasmith’s illustrations

January 16th, 2008 § 0 comments § permalink

Ted Nasmith is one of my favorite artists. My first contact with his works came through a friend who introduced me to his illustrations based on the books by J. R. R. Tolkien. Nasmith is one of the best illustrators of the Professor’s work and his drawings and paintings have graced dozens of publications. In fact, many of the scenes in the movie version of The Lord of the Rings are directly taken from his work.

A couple of days ago I found that he is drawing new pieces based on the books of George R. R. Martin’s excellent series, A Song of Ice and Fire. The few drawings displayed on the site are incredibly beautiful and evocative.

Whether you are a fan or not, you will not regret spending a few minutes on Nasmith’s site. In fact, it’s very likely you will spend a lot of time. It will be worth your while.

Autonomic Debugging

January 15th, 2008 § 0 comments § permalink

I’m reading Blink, by Malcolm Gladwell, which is about the ability to arrive at correct decisions from minimal information–in other words, in an instinctive or intuitive way. I’ll write more about the book later, but I’ve been thinking about Gladwell’s argument and how it applies to the field of software development.

What occurred to me is that experienced programmers are able to make the same kind of instantaneous judgments, especially when they are debugging a program. I can remember countless occasions in my programming career when the simple act of looking at the code, without even trying to read in detail what was written, would generate a clear picture of what was wrong with that specific part of the application.

I think any other programmer would be able to say the same. That ability seems to be a mix of general programming knowledge and specific application knowledge. And the longer you program, the better you will be at spotting problems in the presumed function and structure of the code. It doesn’t matter if the problem is simple–duplicate rows because of a missing join statement, for example–or complex–subtle behavior problems in the application because of slightly changed configuration parameters.
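The simple case is easy to reproduce in a few lines. The two-table schema below is hypothetical, invented just to show the symptom an experienced eye spots instantly–duplicate rows caused by a missing join condition:

```python
import sqlite3

# Hypothetical schema: users and their orders, kept in memory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER, name TEXT);
    CREATE TABLE orders (id INTEGER, user_id INTEGER, item TEXT);
    INSERT INTO users  VALUES (1, 'Ana'), (2, 'Bruno');
    INSERT INTO orders VALUES (1, 1, 'book'), (2, 2, 'pen');
""")

# Missing join condition: every user is paired with every order,
# yielding a cartesian product (4 rows instead of 2).
buggy = conn.execute(
    "SELECT u.name, o.item FROM users u, orders o"
).fetchall()

# With the join condition, each user matches only their own orders.
fixed = conn.execute(
    "SELECT u.name, o.item FROM users u JOIN orders o ON o.user_id = u.id"
).fetchall()

print(len(buggy), len(fixed))  # 4 2
```

A glance at the doubled row count in the output is usually all the diagnosis an experienced programmer needs.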

It’s interesting to compare the behavior of two programmers with different levels of experience. Curiously, I have been doing something like that for a while, even before I started reading the book, and I think Gladwell is quite right here. I don’t agree with many of his arguments in the book, but the basic relationship between expertise and intuition is something we often miss.

The converse is also interesting: the times when instinct fails. That may cause a programmer to spend hours looking for a ridiculously small problem–a wrong letter in a protocol definition, say, that prevents the entire program from working and produces a misleading error message. The fact that this kind of problem can be solved by falling back (taking some time away from the problem or asking for a second opinion) indicates that the mechanism is, to a certain extent, resettable.

Anyway, it’s quite interesting to think about the way our mind works and the ability it has to make those instantaneous comparisons and classifications.

Coding Elegance

January 11th, 2008 § 4 comments § permalink

The equivalence between elegance, beauty, and correctness is almost an axiom in the field of mathematics. Bertrand Russell expresses this correlation thus:

Mathematics, rightly viewed, possesses not only truth, but supreme beauty–a beauty cold and austere, like that of sculpture, without appeal to any part of our weaker nature, without the gorgeous trappings of painting or music, yet sublimely pure, and capable of a stern perfection such as only the greatest art can show. The true spirit of delight, the exaltation, the sense of being more than Man, which is the touchstone of the highest excellence, is to be found in mathematics as surely as in poetry.

Bertrand Russell, The Study of Mathematics

Code, once we consider its mathematical roots, presents the same intrinsic correlation. Although it is too mutable to evoke the cold and austere beauty to which Russell alludes, the fact that code and its products exhibit the same aesthetic imperatives is obvious even to the most inexperienced programmers. Even users can occasionally apprehend those aspects of code when they talk about the way a given application works and how functional and usable it is.

Most of that elegance derives from the incremental economy one can achieve by successively refining a body of code. The author of The Little Prince describes those steps with the following words:

Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.

Antoine de Saint Exupéry, Terre des Hommes

Exupéry’s criterion is an excellent validation tool for what code should be–and by extension, any of its products–in its final form. There is beauty and perfection to be found in code, to borrow Russell’s words, as surely as there is beauty and perfection in the most cherished poems.

To the intellect of programmers, this beauty is visually clear in what they produce, easily expressed in the successive reductions they can perform to achieve a core of functionality that will stand the test of time. Obviously, that perfection depends both on the programmer and on the tools he chooses to employ, but it’s available to any practitioner of the craft willing to make the effort to become a master craftsman. As another great programmer said:

Ugly programs are like ugly suspension bridges: they’re much more liable to collapse than pretty ones, because the way humans (especially engineer-humans) perceive beauty is intimately related to our ability to process and understand complexity. A language that makes it hard to write elegant code makes it hard to write good code.

Eric S. Raymond

Since it is a function of a developed sense of programming, I believe it’s possible to purposefully choose to code beautifully. It’s a matter of time and options, something every programmer should think about regularly in the course of his career. Training oneself to recognize beauty may seem far-fetched, considering that reading code is much harder than writing it, but that may be the key to the task: beautiful code will be much more readable than ugly code, and that will help programmers identify and recognize good code.

Ultimately, the challenge for every programmer is to learn to code with elegance, teaching himself or herself to recognize code that meets standards of concision, simplicity, and beauty–which brings us to another quotation:

Simplicity carried to the extreme becomes elegance.

Jon Franklin

My advice to those who are beginning their programming careers, and also to those who feel their code is becoming bloated and unwieldy, is this: train yourself to code in a way that shows the problem-solving intent of each line and demonstrates that your code is the best way to solve the problem at hand.

In less time than you realize, elegance will be second nature to you, with all the benefits it brings. It’s hard work, but worth it.

More companies join DataPortability

January 10th, 2008 § 4 comments § permalink

Two days ago, Google and Facebook shook the industry when they decided to join the DataPortability Workgroup, as enthusiastically reported by Read/Write Web. Today, with the same enthusiasm–and not without a certain sense of disbelief–Read/Write Web is reporting that three more big players have joined the initiative: Flickr, SixApart and LinkedIn. As with Google and Facebook, their representatives will not be mere employees but people with a history of involvement with the issues sponsored by DataPortability.

It’s quite obvious that Google and Facebook’s move was what prompted those three companies to join the group. Likewise, we can certainly expect more companies to join in the coming days and months. So what began as an idea to provide guidance to the industry may become a real force for the implementation of portability standards in the coming years.

The most important thing about the whole effort is that all the standards supported by the DataPortability Workgroup are open and, together, represent a natural deterrent against the kind of attitude we often see from Microsoft, that is, embracing standards and later changing them to make them slightly incompatible with other implementations in order to keep its dominant position in the industry.

This year seems more and more promising for open standards. OpenID is being discussed and implemented by lots of applications, and much more is happening each day. The next couple of months may eventually become milestones in the history of the Internet. I certainly hope so.

A few notes on Twitter

January 9th, 2008 § 0 comments § permalink

[1] talk in a light, high-pitched voice
[2] idle or ignorant talk

The Oxford Pocket Dictionary, 2007 edition

In light of those definitions, one could wonder why Twitter chose such a name. Then again, maybe the joke is on us.

I’ve been using Twitter intermittently for a few months now. I started using it at a BarCamp for which I provided live coverage for my Portuguese blog readers, and decided to experiment with both formats simultaneously: I would write a more elaborate entry after each discussion, and would try to post tidbits of the conversation on Twitter while people were talking.

The experience ultimately left me dissatisfied with Twitter. Maybe I’m not a multi-tasking person, but trying to post to Twitter while listening to people talk, and also trying to keep up with replies to the Twitter entries, proved too distracting for me.

After a couple of months of usage, I can’t say I have any special insights about Twitter. Twitter seems to be IRC done socially. IRC has long been a popular application among a certain kind of Internet user, but it depends on a very specific application and a clear choice about which channels to follow. Twitter changes the equation by allowing a user to subscribe to people instead of channels. Obviously, it lacks the focus of a dedicated IRC channel, although it provides a way for its users to reach across followers and track specific subjects.

This mechanism can be used efficiently by people trying to keep up with meetings and conferences, although some users will be uncomfortable with the flood of information tracking can unleash. But considering that neither IRC nor IM can provide such immediacy across people not linked by personal contacts, Twitter has a definitive advantage here.

The ability to follow people and occasionally track specific channels may prove to be Twitter’s only advantage for me. Using it, I’m able to keep up with the Zeitgeist of the people I’m following–and, indirectly, of the people they are in turn following. Also, if the people I follow belong to my market, I may be able to glimpse trends by seeing what’s calling their attention during the day. Of course, this can be quite misleading, and people seeking insight may not find what they are looking for.

Twitter is a noisy tool. Keeping it continuously on is a sure way to lower productivity. It is worse than IM in this regard, because most people still respect status messages, while Twitter grants an implicit permission to call your attention: if you are following somebody, it’s quite obvious you want to see what he or she is posting. This stream of consciousness can be very distracting.

Ultimately, Twitter has outsourced office talk, and the same restrictions apply. You may think people are not paying attention only to find your boss listening over your shoulder.

I will probably keep using Twitter by applying the same logic I apply to IM. I follow status conventions, and I rarely allow people to bother me when I’m signaling that I’m busy. With Twitter, this means turning it off whenever I need to focus on a problem. What Twitter itself may gain from my participation remains to be seen.

Facebook and Google join the DataPortability Workgroup

January 8th, 2008 § 0 comments § permalink

Read/Write Web is reporting–with great enthusiasm, I may add–that both Google and Facebook have agreed to join the DataPortability Workgroup. I have not found any evidence of the news on the DataPortability site itself, but coming from Read/Write Web there is little chance the news is not true. It’s interesting to realize the announcement came just a few days after the controversy around the removal of Scoble’s account from Facebook.

The DataPortability Workgroup is an initiative working to promote the reuse and transparent transfer of user data across Web applications. This is a huge challenge, but it’s good to see some smart people working on the multiple issues involved. The presence of two of the most social companies in existence today will certainly give the project a new measure of legitimacy–especially considering the people who will represent those two companies in the group.

One of the early documents created by the group shows that one of its primary objectives is to use existing technology to leverage the transformations needed to reach its goal. This is a good strategy, since it improves the chances that the recommendations made by the group will be followed and that they will be easy to implement.

With Google and Facebook joining DataPortability, the eventual creation of a portability API may represent a turning point for Web applications, as it will empower users and create an entirely new industry around the possibilities of managing such data. Of course, the entire problem revolves around issues of privacy, and that will be a tough nut to crack. Hopefully, with Google and Facebook, and the other companies that will surely follow their lead, working on the problem, an interesting and useful solution may arrive soon.

As the Read/Write Web piece says, that may represent a magical time for the development of the Web. That’s what we are hoping for, anyway.

Where am I?

You are currently viewing the archives for January, 2008 at Reflective Surface.