The Seaside Bookshelf

February 14th, 2008 § 0 comments § permalink

To those curious about how Seaside applications are structured, or just looking for a simple example of how they differ from more conventional Web frameworks, I’m making available the code of a small experiment of mine: a simple system to keep track of the books I’m reading, have read, or intend to read.

On the remote chance somebody asks about this, the system is inspired by and modeled after Caffo’s bookshelf. Of course, his system is prettier–and faster too, at the moment.

Some caveats about the code:

  • It’s running on a very old and underpowered server.
  • This is an alpha version so don’t expect subtleties in the code. I’m still learning Seaside, and migrating from 2.6 to 2.8 proved an interesting exercise.
  • The application depends on an instance of GOODS. The connection data for the instance can be configured in the application settings.
  • The login is a beautiful example of how things should not be done. I started with a normal login system, got lazy along the way, and adapted it to allow just one user to log in. That user can be configured in the application settings as well.
  • I’m not using any deployment optimizations. Everything is in memory, and thumbnails are generated on the fly.
  • The code is Squeak-specific.

That said, the system shows how a Seaside application runs and how Magritte can be used to model data. It’s enough to show how Seaside differs from the other Web frameworks in common use today.

The code can be found below:

While the code will run in any Squeak 3.9 image, I recommend Damien Cassou’s Squeak-Web image. With his latest image, it’s just a matter of loading GOODS and the code to begin development. GOODS configuration, of course, is left as an exercise for the reader.
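
For those who just want a feel for how Magritte and Seaside fit together before downloading anything, here is a minimal sketch of the pattern the bookshelf follows. This is not the bookshelf code itself; the Book class, its fields, and the BookshelfView component are only illustrative, and the sketch assumes a Squeak image with Seaside 2.8 and Magritte already loaded:

    "A domain class whose fields are described with Magritte."
    Object subclass: #Book
        instanceVariableNames: 'title author'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'Bookshelf-Example'

    Book class >> descriptionTitle
        "Magritte derives editors and validation from descriptions like this one."
        ^ MAStringDescription new
            accessor: #title;
            label: 'Title';
            beRequired;
            yourself

    Book class >> descriptionAuthor
        ^ MAStringDescription new
            accessor: #author;
            label: 'Author';
            yourself

    "A Seaside component that lists the books and calls a Magritte-generated editor."
    WAComponent subclass: #BookshelfView
        instanceVariableNames: 'books'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'Bookshelf-Example'

    BookshelfView >> books
        "Lazy initialization keeps the sketch independent of instance creation details."
        ^ books ifNil: [ books := OrderedCollection new ]

    BookshelfView >> renderContentOn: html
        html heading: 'My bookshelf'.
        html unorderedList: [
            self books do: [ :each |
                html listItem: [ html text: each title, ' by ', each author ] ] ].
        html anchor
            callback: [ self addBook ];
            with: 'Add a book'

    BookshelfView >> addBook
        "The usual Magritte-Seaside form pattern: asComponent builds an editor
         from the descriptions; call: answers the saved object or nil on cancel."
        | editor book |
        editor := Book new asComponent.
        editor addValidatedForm.
        book := self call: editor.
        book isNil ifFalse: [ self books add: book ]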

Obama ahead

February 13th, 2008 § 0 comments § permalink

I’m impressed with the extent of Obama’s victories in the past weeks. Not only is he surpassing the predicted margins, but those numbers are coming from groups analysts were sure would vote for Hillary–the Latino population in Virginia, for example.

I confess I’m still trying to decide whether people are voting for Obama because, although black, he’s still a man. I’m not naïve enough to presume all voters are choosing him because he’s more qualified, especially considering that the Latino population–at least here in Brazil–has always been strongly sexist where women in leadership positions are concerned.

Of course, part of this will be tested when Obama–as now seems likely–faces the Republican candidate in November. Considering the way the Republican nomination is going, and that many Republicans are actually voting for Obama, the outcome of the election will demonstrate how ready the American people are for a president unlike any other in history.

I’m hoping that Obama wins. Not because Hillary is a woman, but because I dislike the way she abides by the old school of politics. America needs a new face, and Obama would clearly resonate a lot better with the rest of the world.

Software Craftsmanship

February 7th, 2008 § 4 comments § permalink

Recently, somebody recommended that I take a look at Software Craftsmanship, by Pete McBreen, as a good treatment of software engineering versus software craftsmanship as approaches to software development.

The theme is indeed interesting, but I was surprised to see how badly the book is written. McBreen, granted, does a decent job of presenting the main arguments for both sides–which is more than you would expect from a proponent of a specific approach–but he also repeats those same arguments endlessly. I don’t know how an editor managed to let something like that happen, but if the incessant repetition were to be eliminated the book would lose at least three quarters of its almost three hundred pages.

McBreen’s argument is simple: software engineering is appropriate only for huge projects (those in the range of 100 developer-years and above). For simpler projects, needing faster development and no critical hardware infrastructure, the old concept of craftsmanship is much more interesting: a master craftsman running a team of journeymen and apprentices.

I agree with the arguments and many of the other conclusions presented by McBreen. In fact, as far as I’m concerned, that’s exactly the way I’ve been running my own small company. The results, so far, have been excellent.

Many people reading the book, however, will quickly give up after reading two or three entire chapters essentially saying the same thing. They won’t look kindly, either, on statements like the one below:

Software craftsmanship is the new imperative because many members of the software development community are starting to chase technology for its own sake, forgetting what is important.

The fact that the second part of that sentence is painfully obvious, and that the relationship between the first and second parts is clearly a non sequitur, doesn’t seem to bother McBreen, though.

Nevertheless, much of what McBreen says is valid and necessary, as when he describes how “good enough” software is not really good for users or for the industry. Some of his analysis of the prevalent (and wrong) metaphors–like car building versus car design–was interesting enough to motivate me to finish the book.

Ultimately, the book is necessary and part of one of the most important debates taking place in the industry today. I’m afraid, however, that many readers will abandon it after a couple of chapters, put off by McBreen’s redundant style. More’s the pity, because a good editor could have made the book the new Peopleware.

Online Primary

January 30th, 2008 § 3 comments § permalink

Via Read/Write Web, an interesting site that simulates the US primary: Online Primary. According to the article, the site is an experiment in online elections–something I believe we are very far from being able to do, not because the technology is not here but because we can’t trust the established providers.

Very few people have “voted” on the site so far, but it’s interesting to see that Democrats seem more likely to engage in such experiments, and it’s also interesting to note that Barack Obama has almost no votes in the “Anybody But” category.

Since the Democratic nomination seems to be about people rejecting a candidate rather than choosing one, it will be fascinating to follow the site and watch the trends emerge.

Living in Code

January 27th, 2008 § 0 comments § permalink

God said, “Cancel Program GENESIS.” The universe ceased to exist.

Arthur C. Clarke

The Universe as a virtual machine or a simulation is a very old idea. Even outside of science, many cultures have thought of material existence as the dream of a god.

Christianity has always dealt introspectively with questions about the relationship between God and the Universe: for example, if only God existed before time and space, where is the Universe in relation to God, and did the Divinity have to limit Himself when He created it?

More recently, with the rise of favorable conditions, many people have started to devote more time to this kind of exploration–which is a very natural curiosity, I may add, considering we all want to know how and why we are here, in this particular time and state.

So it’s no surprise that a recent article, The Physical World as a Virtual Reality, attempts to frame the virtual reality question in light of modern physics. The article is the result of a scientist’s exploration of the implications of a virtual Universe within our current physics framework.

It’s a fascinating read, although no conclusion is reached and no attempt is made to build mathematical models around the questions presented–something I doubt is possible now, and which may, in fact, never be possible. Of course, if we were able to prove that the Universe is a simulation, the implications would be civilization-changing (Simulacron-3 and The 13th Floor are very good fictional explorations of those themes).

But more interesting than that would be attempts to hack the code of the Universe, changing its laws and introducing new ones. One could imagine an infinite series of Universes, each running its own giant simulations and experiments.

Of course, that begs the ultimate question: if we are in a simulation, what form do blue screens of death take?

Kenna

January 22nd, 2008 § 0 comments § permalink

In Blink, Malcolm Gladwell uses the musician Kenna as an example of good music that is not marketable, because marketing people usually can’t recognize it as good, but which knowledgeable music lovers will love. According to Gladwell, this is an example of a case where just sampling something will not yield accurate results when using intuitive expertise.

Regardless of Gladwell’s conclusions, I decided to try and hear a bit of Kenna’s music–and I was floored. Kenna’s music is unclassifiable. It’s a powerful mix of many styles, so well matched that one can’t help but listen endlessly to the variations just one song can provide.

I’ve been listening to the songs on Make Sure They See My Hands, and the variation is unbelievable. Daylight, the album’s first song, for example, opens with a very New Age intro, evolves into a mix of soul and electronica, and finally becomes an operatic rock song. Be Still, on the other hand, has the melodic roots of a traditional rock song but also includes a soft blend of synth pop that makes it unforgettable.

All the other songs share this kind of diversity, using a mix of hip hop, house, synth pop, electronica, soul, rock, and many other styles that will probably satisfy even the most demanding music lover. The U2 influence is evident in many songs (Sun Red, Sky Blue could have been a U2 song), and I liked that aspect of Kenna’s music as well.

In short, a very worthy addition to my music library.

Needed: a new paradigm for Web development

January 18th, 2008 § 0 comments § permalink

In the past few days I have been thinking about the future of development–especially about the growing interest in tests and domain-specific languages, and about the new trends in Web development. I was surprised to realize that, despite all the talk about how they may revolutionize the field, no significant application or framework is combining those concepts into something truly new.

The history of the field is abysmal. We are now forty years into a period of very few changes in the conceptual basis of software development. For twenty years we have been using basically the same tools, practicing the same moves, and not moving at all. The industry remains bound to the minefield of object-oriented programming, relational databases, and bottom-up design.

With regard to Web development, for example, although innovative in many ways, Rails and Django share two flaws that will make them obsolete as quickly as the many other frameworks that have graced the field in the last decade.

The first flaw is conceptual fragmentation. In an attempt to make Web development “painless”, those two frameworks and their descendants have diluted the way the application domain is represented in the application. It’s a more manageable–dumbed-down, if you will–way to develop applications, but the disconnection between code and domain is fairly evident.

The second flaw is the fixation on opinionated solutions. The use of REST by Rails is a good example of this kind of fixation. REST is a very useful concept, even necessary for some applications, but Rails’ half-baked, bolted-on solution, full of accessory tricks, is sub-optimal. Yet Rails developers stick to it without questioning what it represents for their applications, because Rails is opinionated software.

In fact, many of those so-called modern frameworks are just pretending that complexity does not exist, or that it can be easily managed by the use of small methods and a couple of tests.

Test-driven development is now being treated as a silver bullet. New developers are using it as a panacea–as a way to guide design, as if it were possible to analyze the problem domain of an application by peering through the small window offered by TDD. The enormous number of examples showing how to test what has already been tested is quite insane.
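
To make that last point concrete, here is a contrived SUnit sketch of the kind of redundant test I mean; the Account class and its accessor are hypothetical, invented only for the illustration:

    Object subclass: #Account
        instanceVariableNames: 'owner'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'TDD-Example'

    Account >> owner
        ^ owner

    Account >> owner: aString
        owner := aString

    TestCase subclass: #AccountTest
        instanceVariableNames: ''
        classVariableNames: ''
        poolDictionaries: ''
        category: 'TDD-Example'

    AccountTest >> testOwner
        "This merely restates the trivial accessor pair above. It exercises
         no domain behavior and can only fail if assignment itself breaks,
         yet tests just like it fill many TDD tutorials."
        | account |
        account := Account new.
        account owner: 'John'.
        self assert: account owner = 'John'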

Seaside, which I tend to defend as the next step in Web development because of its innovative application of old concepts and its refusal to submit to the common solutions, is not the full solution, though. It’s great, it’s necessary, but it is still a step below what we really need.

Hopefully, the interest in concepts like language-oriented programming will invite attempts to solve the Web development problem in new ways that will transform the field until the next revolution is needed.

Maybe we need a way to generate executable specifications that are really a way to build applications, and not an inferior way to model the expected behavior of an application. Maybe that can be a New Year’s resolution: to think of a way to connect the dots, to join the loose threads created in the past twenty years. Is anybody up to it?

Ted Nasmith’s illustrations

January 16th, 2008 § 0 comments § permalink

Ted Nasmith is one of my favorite artists. My first contact with his works came through a friend who introduced me to his illustrations based on the books by J. R. R. Tolkien. Nasmith is one of the best illustrators of the Professor’s work and his drawings and paintings have graced dozens of publications. In fact, many of the scenes in the movie version of The Lord of the Rings are directly taken from his work.

A couple of days ago I found out that he is drawing new pieces based on the books in George R. R. Martin’s excellent series, A Song of Ice and Fire. The few drawings displayed on the site are incredibly beautiful and evocative.

Whether you are a fan or not, you will not regret spending a few minutes on Nasmith’s site. In fact, it’s very likely you will spend a lot of time. It will be worth your while.

Autonomic Debugging

January 15th, 2008 § 0 comments § permalink

I’m reading Blink, by Malcolm Gladwell, which is about the ability to arrive at correct decisions from minimal information–in other words, in an instinctive or intuitive way. I’ll write more about the book later, but I’ve been thinking about Gladwell’s argument and how it applies to the field of software development.

What occurred to me is that experienced programmers are able to make the same kind of instantaneous judgments, especially when they are debugging a program. I can remember countless occasions in my programming career when the simple act of looking at the code, without even trying to read in detail what was written, would generate a clear picture of what was wrong with that specific part of the application.

I think any other programmer would be able to say the same. That ability seems to be a mix of general programming knowledge and specific application knowledge. And the longer you program, the better you will be at spotting problems in the presumed function and structure of the code. It doesn’t matter if the problem is simple–duplicate rows because of a missing join statement, for example–or complex–subtle behavior problems in the application because of slightly changed configuration parameters.

It’s interesting to compare the behavior of two programmers with different levels of experience. Curiously, I have been doing something like that for a while, even before I started reading the book, and I think Gladwell is quite right here. I don’t agree with many of his arguments in the book, but the basic relationship between expertise and intuition is something we often miss.

The converse is also interesting: the times when instinct fails. That may cause a programmer to spend hours looking for a ridiculously small problem–a wrong letter in a protocol definition that prevents the entire program from working and produces a misleading error message. The fact that this kind of problem can be solved by falling back (taking some time away from the problem or asking for a second opinion) indicates that the mechanism is, to a certain extent, resettable.

Anyway, it’s quite interesting to think about the way our mind works and the ability it has to make those instantaneous comparisons and classifications.

Coding Elegance

January 11th, 2008 § 4 comments § permalink

The equivalence between elegance, beauty, and correctness is almost an axiom in the field of mathematics. Bertrand Russell expresses this correlation thus:

Mathematics, rightly viewed, possesses not only truth, but supreme beauty–a beauty cold and austere, like that of sculpture, without appeal to any part of our weaker nature, without the gorgeous trappings of painting or music, yet sublimely pure, and capable of a stern perfection such as only the greatest art can show. The true spirit of delight, the exaltation, the sense of being more than Man, which is the touchstone of the highest excellence, is to be found in mathematics as surely as poetry.

Bertrand Russell, The Study of Mathematics

Code, once we consider its mathematical roots, presents the same intrinsic correlation. Although it is too mutable to evoke the cold and austere beauty to which Russell alludes, the fact that code and its products exhibit the same aesthetic imperatives is obvious even to the most inexperienced programmers. Even users can occasionally apprehend those aspects of code when they talk about the way a given application works and how functional and usable it is.

Most of that elegance derives from the incremental economy one can achieve by successively refining a body of code. The author of The Little Prince describes those steps with the following words:

Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.

Antoine de Saint Exupéry, Terre des Hommes

Exupéry’s criterion is an excellent validation tool for what code–and, by extension, any of its products–should be in its final form. There is beauty and perfection to be found in code, to borrow Russell’s words, as surely as there is beauty and perfection in the most cherished poems.

To the intellect of programmers, this beauty is visually clear in what they produce, easily expressed in the successive reductions they can perform to achieve a core of functionality that will stand the test of time. Obviously, that perfection depends both on the programmer and on the tools he chooses to employ, but it’s available to any practitioner of the craft willing to make the effort to become a master craftsman. As another great programmer said:

Ugly programs are like ugly suspension bridges: they’re much more liable to collapse than pretty ones, because the way humans (especially engineer-humans) perceive beauty is intimately related to our ability to process and understand complexity. A language that makes it hard to write elegant code makes it hard to write good code.

Eric S. Raymond

Since beauty is a function of a developed sense of programming, I believe it’s possible to purposefully choose to code beautifully. It’s a matter of time and options, something every programmer should think about regularly in the course of his career. Training oneself to recognize beauty may seem far-fetched, considering that reading code is much harder than writing it, but that may be the key to the task: beautiful code will be much more readable than ugly code, and that will help programmers identify and recognize good code.

Ultimately, the challenge for every programmer is to learn to code elegantly, teaching himself or herself to recognize code that meets standards of concision, simplicity, and beauty–which brings us to another quotation:

Simplicity carried to the extreme becomes elegance.

Jon Franklin

My advice to those who are beginning their programming careers, and also to those who feel that their code is becoming bloated and unwieldy, is this: train yourself to code in a way that shows the problem-solving intent of each line and shows that your code is the best way to solve the problem at hand.

In less time than you realize, elegance will be second nature to you, with all the benefits it brings. It’s hard work, but worth it.