Doing it with GroupOn

December 27th, 2010 § 2 comments § permalink

There is a big discussion going on about GroupOn's business model. After the company refused Google's acquisition offer, nobody could decide whether the company owners were just too crazy or too brilliant, confident in their ability to outperform any kind of offer Google could come up with–and that after Google virtually doubled its initial offer.

John Battelle is one of those who think GroupOn made the right choice. He writes:

Good sources have told me that GroupOn is growing at 50 percent a month, with a revenue run rate of nearly $2 billion a year (based on last month’s revenues). By next month, that run rate may well hit $2.7 billion. The month after that, should the growth continue, the run rate would clear $4 billion.
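(To unpack the arithmetic: at 50 percent monthly growth, a run rate of roughly $1.8 billion compounds to about $2.7 billion after one month and just over $4 billion after two–1.8 × 1.5 ≈ 2.7, and 2.7 × 1.5 ≈ 4.05.)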

Battelle attributes this to a combination of factors (relationships, location, and timing–see the article for a more in-depth explanation) that make GroupOn's appeal to small business pretty much irresistible. As he notes in his article, that run rate is triple what Google itself experienced in its early years.

I was talking to a friend a while back about social buying and he said the major problem that will affect and eventually kill GroupOn–and all of its clones, by extension–is churn, that is, the fact that a lot of the offers were creating problems for the businesses using the platform. In fact, there have been a lot of reports about people being mistreated when they came bearing a GroupOn or equivalent coupon, and I have heard some of those stories first-hand. In many of those cases, the business owners had miscalculated what they could or should offer and were unhappy with the entire experience, consequently becoming less and less interested in working again with GroupOn or their local clone.

But I believe that the churn we are seeing right now is just a consequence of the way new markets behave. If Battelle is right–and I believe he is–the rate of churn will fall with time as businesses begin to find their sweet spots in the social buying ecosystem. I don't see why, for example, GroupOn can't offer a tool that allows a business to input some parameters and find the ideal price for a given offer. Granted, that will never be exactly precise, but it would give most business owners using the platform a way to avoid the most extreme problems.
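To make the idea concrete, here is a minimal sketch of what such a tool could compute. Everything in it–parameter names, defaults, and the formula itself–is my own invention for illustration, not anything GroupOn actually offers:

```python
# Hypothetical sketch of a deal-pricing helper. All parameters are
# invented for illustration; GroupOn offers no such tool that I know of.

def minimum_deal_price(marginal_cost, platform_cut=0.5,
                       redemption_rate=0.8, repeat_value=0.0):
    """Lowest coupon price at which an offer breaks even.

    marginal_cost   -- cost of serving one coupon customer
    platform_cut    -- fraction of the coupon price kept by the platform
    redemption_rate -- fraction of sold coupons actually redeemed
    repeat_value    -- expected future profit from a redeemed customer
    """
    # Each sold coupon nets price * (1 - platform_cut) in revenue and
    # costs (marginal_cost - repeat_value) * redemption_rate overall.
    net_cost = (marginal_cost - repeat_value) * redemption_rate
    return max(net_cost / (1 - platform_cut), 0.0)

# A restaurant whose meal costs $12 to serve, expecting $5 of future
# profit per redeemed coupon, shouldn't price the deal below $11.20.
print(minimum_deal_price(marginal_cost=12.0, repeat_value=5.0))
```

Even a naive calculator like this would have spared some of the business owners mentioned above from pricing themselves into a loss.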

But, ultimately, I believe that GroupOn will succeed because it's changing the way people relate to the businesses using social buying to attract them. Two other friends recently told me how our local clones had impacted their buying patterns.

One of them, a 40-ish divorced guy, said he wouldn't dine out anymore unless he had a coupon, and that the coupons were helping him raise the bar with regard to the kind of places he was able to go to in a single month. Previously, he could afford a more expensive place just a couple of times each month. With the help of coupons, he was going to more expensive places once or more each week. That's a huge change in spending patterns, and one that's benefiting both him and the restaurants he likes.

The other, a guy of 30 or so, said social buying was actually helping him get laid. You see, this is a single guy who is using a variety of coupons–for restaurants, spas, clothing, small items–to impress and convince women to have sex with him. He is still spending a considerable amount of money, but GroupOn and the like are helping him spend that money more efficiently toward his objectives–which, right now, are pretty much limited to getting laid as many times as possible with as many women as possible. And it's pretty evident from the way the market works that any business that helps people get laid–or find any measure of sexual satisfaction, for that matter–is in a much better position to thrive.

So there you have it: people are getting laid using GroupOn. That makes GroupOn's business position a much stronger one. Battelle is right from the small business' point of view–but my friend is also right from the consumer's point of view.

Either way, GroupOn wins.

Needed: a new paradigm for Web development

January 18th, 2008 § 0 comments § permalink

In the past few days I have been thinking about the future of development–especially about the growing interest in tests and domain-specific languages, and about the new trends in Web development. I was surprised to realize that, despite all the talk about how those concepts may revolutionize the field, no significant application or framework is combining them into something truly new.

The history of the field is abysmal. We are now forty years into a period of very few changes in the conceptual basis of software development. For twenty years we have been using basically the same tools, practicing the same moves, and not moving at all. The industry remains bound to the minefield of object-oriented programming, relational databases, and bottom-up design.

With regard to Web development, for example, although innovative in many ways, Rails and Django share two flaws that will make them obsolete as quickly as the many other frameworks that have graced the field in the last decade.

The first flaw is conceptual fragmentation. In an attempt to make Web development "painless", those two frameworks and their descendants have diluted the way the application domain is represented in the application. It's a more manageable–dumbed-down, if you will–way to develop applications, but the disconnection between code and domain is fairly evident.

The second flaw is the fixation on opinionated solutions. The use of REST by Rails is a good example of this kind of fixation. REST is a very useful concept, even necessary for some applications, but Rails's half-baked external solution, full of accessory tricks, is sub-optimal. Yet Rails developers stick to it without questioning what it represents for their applications, because Rails is opinionated software.

In fact, many of those so-called modern frameworks are just pretending complexity does not exist, or that it can be easily managed by the use of small methods and a couple of tests.

Test-driven development is now being treated as a silver bullet. New developers are using it as a panacea–as a way to guide design, as if it were possible to analyze the problem domain of an application by peering through the small window offered by TDD. The enormous number of examples showing how to test what has already been tested is quite insane.
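To illustrate with an example invented for this post (not taken from any real codebase), consider a test that exercises the language's attribute machinery rather than any rule of the application:

```python
# An invented example of testing what has already been tested: this
# verifies Python's attribute handling, not any business rule.

class Product:
    def __init__(self, name, price):
        self.name = name
        self.price = price

def test_product_has_a_name():
    product = Product(name="Widget", price=10)
    # This assertion can only fail if the language itself is broken.
    assert product.name == "Widget"

test_product_has_a_name()
```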

Seaside, which I tend to defend as a next step in Web development because of its innovative application of old concepts and its refusal to submit to the common solutions, is not the full solution, though. It's great, it's necessary, but it is still a step below what we really need.

Hopefully, the interest in concepts like language-oriented programming will invite attempts to solve the Web development problem in new ways that will transform the field until the next revolution is needed.

Maybe we need a way to generate executable specifications that are really a way to build applications, not an inferior way to model an application's expected behavior. Maybe that can be a New Year's resolution: to think of a way to connect the dots, to join the loose threads created in the past twenty years. Is anybody up to it?
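To make the resolution slightly more concrete, here is a toy sketch of the direction I have in mind–every name in it invented for illustration. The specification below is not a description of an application written elsewhere; it is interpreted directly to produce the application's behavior:

```python
# A toy executable specification: the SPEC dictionary *is* the
# application, interpreted at run time. All names are invented.

SPEC = {
    "resource": "article",
    "fields": {"title": str, "body": str},
    "rules": ["title must not be empty"],
}

def build_validator(spec):
    """Turn a declarative spec into working validation behavior."""
    def validate(data):
        for field, kind in spec["fields"].items():
            if not isinstance(data.get(field), kind):
                return False, f"{field} must be a {kind.__name__}"
        if "title must not be empty" in spec["rules"] and not data["title"]:
            return False, "title must not be empty"
        return True, "ok"
    return validate

validate_article = build_validator(SPEC)
print(validate_article({"title": "Hello", "body": "World"}))  # (True, 'ok')
print(validate_article({"title": "", "body": "World"}))       # (False, ...)
```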

More companies join DataPortability

January 10th, 2008 § 4 comments § permalink

Two days ago, Google and Facebook shook the industry when they decided to join the DataPortability Workgroup, as enthusiastically reported by Read/Write Web. Today, with the same enthusiasm–and not without a certain sense of disbelief–Read/Write Web is reporting that three more big players have joined the initiative: Flickr, SixApart and LinkedIn. As with Google and Facebook, their representatives will not be mere employees but people with a history of involvement with the issues championed by DataPortability.

It's quite obvious the Google and Facebook move was what prompted those three companies to join the group. Likewise, we can certainly expect more companies to join in the coming days and months. So what began as an idea to provide guidance to the industry may become a real force for the implementation of portability standards in the years ahead.

The most important thing about all this is that the standards supported by the DataPortability Workgroup are open and, together, represent a natural deterrent against the kind of attitude we often see from Microsoft: embracing standards and later changing them just enough to make them incompatible with other implementations, in order to keep its dominant position in the industry.

This year seems more and more promising for open standards. OpenID is being discussed and implemented by lots of applications, and much more is happening each day. The next couple of months may eventually become milestones in the history of the Internet. I certainly hope so.

Facebook and Google join the DataPortability Workgroup

January 8th, 2008 § 0 comments § permalink

Read/Write Web is reporting–with great enthusiasm, I may add–that both Google and Facebook have agreed to join the DataPortability Workgroup. I have not found any mention of the news on the DataPortability site itself, but coming from Read/Write Web, there is little chance it isn't true. It's interesting to note the announcement came just a few days after the controversy around the removal of Scoble's account from Facebook.

The DataPortability group is an initiative working to promote the reuse and transparent transfer of user data across Web applications. This is a huge challenge, but it's good to see some smart people working on the multiple issues involved. The presence of two of the most social companies in existence today will certainly give the project a new measure of legitimacy–especially considering the people who will represent those two companies in the group.

One of the early documents created by the group shows that one of its primary objectives is to use existing technologies to drive the transformations needed to reach its goal. This is a good strategy, since it improves the chances that the group's recommendations will be followed and that they will be easy to implement.

With Google and Facebook joining DataPortability, the eventual creation of a portability API may represent a turning point for Web applications, as it will empower users and create an entirely new industry around the possibilities of managing such data. Of course, the entire problem revolves around issues of privacy, and that will be a tough nut to crack. Hopefully, with Google and Facebook–and the other companies that will surely follow their lead–working on the problem, an interesting and useful solution may arrive soon.
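As pure speculation about what a first version of such an API could return–every field and format choice below is invented, since no such API exists yet–it might simply bundle a user's data as references to the open formats the group already promotes:

```python
# Speculative sketch of a portability API response: one bundle pointing
# to the user's data in open formats. Everything here is hypothetical.

import json

def export_user_data(user_id):
    """Assemble a portable data bundle for a user (hypothetical API)."""
    return {
        "user": user_id,
        "identity": f"https://example.com/openid/{user_id}",  # OpenID URL
        "contacts": f"{user_id}-contacts.html",    # hCard/XFN markup
        "subscriptions": f"{user_id}-feeds.opml",  # OPML subscription list
        "posts": f"{user_id}-posts.atom",          # Atom feed of content
    }

print(json.dumps(export_user_data("alice"), indent=2))
```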

As the Read/Write piece says, this may turn out to be a magical time for Web development. That's what we are hoping for, anyway.

RIA in 2008

January 6th, 2008 § 0 comments § permalink

Tim Bray began his predictions for 2008 by saying that this is the decisive year for RIA applications: either they become mainstream or they will be relegated to the dustbin of history. Given the news about Microsoft planning to overhaul its entire site to show off its RIA platform, Bray is probably right in saying this will be an important year for RIA technologies.

Bray makes an interesting point when he says that he tends to associate "richness" not with interface–which, he also says, is something only developers care about–but with the interactive capabilities of applications, regardless of the technology they use. I agree, but I also think that Silverlight and Flex (and similar technologies) may have a useful role in a different place, providing different levels of interface in a very specific class of applications: internal sites.

Obviously, Microsoft and Adobe are setting their sights much higher than that. The former has its pathological need to control the industry; the latter, its duplicity about open-sourcing its products. I'm not worried. Public-facing applications have different interaction and accessibility needs, and no developer is going to use technologies that will actively harm their applications in those two areas. One of the main problems plaguing alternative interfaces is that they are always trying to catch up with what users have grown used to, and they can never succeed. Between dealing with the cognitive dissonance they force users to experience and dealing with multiple hosting systems, they don't have the leverage to compete with the advances being made in JavaScript integration.

Another interesting point Tim Bray makes is that most applications are Web-enabled to some extent–even if users don’t realize it. Add that to the growing research in offline/online integration and we are dealing with an entirely different playing field.

Contrary to Bray, I will risk a definitive prediction: RIA, as far as Flex and Silverlight are concerned, will indeed be recognized as a secondary option this year, and no big applications–Microsoft's site notwithstanding–will be launched using either technology. Conversely, we will see people using Silverlight or Flex in internal applications.

The rest of the year, however, will belong to Ajax.

Naked Day

April 5th, 2006 § Comments Off on Naked Day § permalink

Utterly naked, and not so proud of it, since my WordPress theme has the navigation coming before the content and no link to skip it.

RSS as a platform

March 18th, 2006 § Comments Off on RSS as a platform § permalink

With the recent developments in the RSS world, including the launch of the Windows RSS Platform, the discussion about using the format as a real platform is undergoing a change. Now there is a lot of talk about how developers can maximize the potential of the format and how they can solve the existing infrastructure problems.

In all that I have read, I didn't see much discussion about mutable RSS feeds, that is, interactive feeds that allow users to pass data back through the aggregator itself, changing the future behavior of the feed based on their choices.

Obviously, support for such interactivity isn't present in the current crop of aggregators–at least not in the many I know or have tried. In fact, a persistent fear of the security problems such interactivity could cause pervades the entire area. XSS and similar exploits led most aggregator developers to completely eliminate the use of objects, forms, and JavaScript inside RSS feeds.

Such an approach puts extreme limits on what you can do with RSS, of course. More than a year ago, answering a question posed by a friend of mine, I wrote here about interactive RSS feeds. The application I designed to test the concept at the time (a very simple prototype) is still running and can be accessed in the test area of this site. It's an RSS feed that, instead of simply presenting content, allows users to act on its entries. Given the limitations of the current generation of feed readers, you will probably need to open each entry in the browser to see the feed in action.

The big question is: what can we really do with RSS? Is a read-only platform enough? I don't think so. Considering the context in which I created the application mentioned above–a course served through RSS–a read-only platform is not that interesting. A typical course has an activity tree that's completely dependent on the students' choices. A read-only format would not provide a complete experience in such a scenario.
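To sketch the idea–with state handling and URLs invented for illustration, and much simplified compared to the actual prototype–a course feed could reveal new entries only as the student acts on the previous ones:

```python
# Bare-bones sketch of a mutable feed: each rendering reflects choices
# the user has already made. State and URLs are invented for illustration.

COURSE = {
    "lesson-1": {"title": "Lesson 1: Basics", "unlocks": "lesson-2"},
    "lesson-2": {"title": "Lesson 2: Advanced", "unlocks": None},
}

completed = set()  # in a real system, per-user state kept server-side

def mark_completed(lesson_id):
    """Called when the user acts on an entry (e.g. follows its link)."""
    completed.add(lesson_id)

def render_feed():
    """Emit only the lessons the user has unlocked so far."""
    unlocked = {"lesson-1"} | {
        COURSE[done]["unlocks"] for done in completed if COURSE[done]["unlocks"]
    }
    items = "".join(
        f"<item><title>{COURSE[l]['title']}</title>"
        f"<link>https://example.com/act?lesson={l}</link></item>"
        for l in sorted(unlocked)
    )
    return f"<rss version='2.0'><channel><title>Course</title>{items}</channel></rss>"

print(render_feed())        # only lesson 1 is visible at first
mark_completed("lesson-1")
print(render_feed())        # acting on lesson 1 unlocks lesson 2
```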

As mentioned before, there are real security concerns involved in allowing users to interact with a feed. Allowing arbitrary content can lead to episodes like the one caused by Mark Pilgrim a couple of years ago, whose RSS feed "took control" of the hosting computer with a clever use of HTML. The text he wrote later about the subject influenced the development of an entire generation of aggregators. Yet browsers handle the same issues today and–despite some problems–they do just fine.

Before I start repeating what I already said in the other article: I believe RSS can evolve far beyond what it is today. New applications–especially in the much-hyped Web 2.0 style–depend on richer interaction than aggregators currently offer. Since competition in the area seems strong, I guess it won't take long until we see changes.

Ajax mistakes

June 1st, 2005 § Comments Off on Ajax mistakes § permalink

A couple of weeks ago, Alex Bosworth wrote about problems resulting from the indiscriminate use of Ajax, listing common errors he found in Ajax applications.

It's a good list, and it shows why the use of Ajax must be given the same attention we have learned to dedicate to other Web techniques. As I wrote before, Ajax can create problems in the use of Web standards–it's easy, for example, to forget about accessibility when you are building a new, shiny Ajax application, since accessibility demands more of such applications.

I'm experimenting a lot now with the mobile Internet (I'll write more about that later), and the only Ajax application I used that degraded nicely on all the mobile devices I tested recently was Google Mail, which is as usable and accessible in a mobile phone browser as it is in a desktop browser.

The main problem with Ajax, in any case, is that it requires yet another step towards graceful failure. A normal Web application, built around normal forms and pages, already needs to fail gracefully (for example, in browsers without CSS or JavaScript support, or when running at unusual video resolutions). Ajax requires another layer of failure handling on top of that, to deal with extremely different run-time conditions: on one side, the page must be able to run as a normal Web page, built around plain HTML and CSS; on the other, it must behave as a rich application, able to modify itself dynamically, with much more flexibility.
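One common way to satisfy both conditions–my own sketch, not something from Bosworth's list–is to serve the same resource either as a complete page or as a bare fragment, depending on how it was requested. The header check below is a convention several JavaScript libraries follow; the page markup is invented:

```python
# Sketch of one endpoint serving both run-time conditions: a complete,
# script-free page for plain requests, a bare fragment for Ajax calls.

def render_messages(request_headers, messages):
    fragment = "".join(f"<li>{m}</li>" for m in messages)
    if request_headers.get("X-Requested-With") == "XMLHttpRequest":
        # Ajax call: return only the fragment the page will splice in.
        return f"<ul>{fragment}</ul>"
    # Plain request: return a full page that works without JavaScript.
    return (
        "<html><head><title>Messages</title></head>"
        f"<body><ul id='messages'>{fragment}</ul></body></html>"
    )

print(render_messages({}, ["hello"]))
print(render_messages({"X-Requested-With": "XMLHttpRequest"}, ["hello"]))
```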

Conflicting requirements like these are at the root of almost all the problems listed by Bosworth. Others, like breaking the "Back" button, are just a natural consequence of Web applications, magnified by the use of Ajax. I have yet to see a popular Web framework that handles the "Back" button correctly. Maybe it's time to forget this button and provide alternative solutions, like cumulative Undo (or even the elimination of Save buttons, as Alan Cooper proposed).

Anyway, seeing Ajax gain acceptance as a new word for an old technology that is just coming of age now, amidst the rise of new frameworks, libraries and techniques, has been an interesting experience. I just hope, again, that we don’t lose the benefits we fought so hard to secure in the form of the now prevalent use of Web standards.

Acid2

April 15th, 2005 § Comments Off on Acid2 § permalink

The Web Standards project has finally released its new Web standards test, Acid2, which is designed to check the correct implementation of advanced HTML, CSS, and PNG features in modern browsers. Quite predictably, all browsers flunk the test.

Gecko, the rendering engine behind Mozilla, fails considerably in some areas, although it visually approximates the correct result. I was surprised at how badly Internet Explorer's rendering engine fails. The page displayed in IE has nothing whatsoever to do with the right result.

Impressive.

That made me think about how little we use CSS's advanced features, considering the wildly differing CSS implementations, which, as shown by the test, have even more problems than previously thought.

The work to correct the flaws has already started. Dave Hyatt, Safari's primary developer, has fixed some bugs in that browser's rendering engine, and explains what he did in his blog. Mozilla will probably follow Safari's lead soon. What remains to be seen is if and how Internet Explorer will fix its problems, considering the uncertainty surrounding its development–even though the team behind it has been reassembled, they don't seem particularly willing to rework IE's rendering engine.

Anyway, Acid2 is a good and interesting piece of work, and I’m sure the browser market, given time, will be better because of it, especially now that people are becoming more aware of the power of development techniques like Ajax.

By the way, if you want to learn how the test works, there’s a page with detailed information about it.

20 million firefoxes

January 24th, 2005 § Comments Off on 20 million firefoxes § permalink

Mozilla Firefox has reached the impressive total of 20 million downloads for version 1.0, which was released last November. I'm happy to hear that, especially considering the trouble I had convincing many co-workers, friends, and relatives to switch to Firefox.

20 million is a big number. Brazilian Internet users number only 14 million, and Brazil is very Internet-aware (ironic as that may seem). Not bad for a browser that doesn't come bundled with the operating system.

Where Am I?

You are currently browsing the Web category at Reflective Surface.