Sunday, October 31, 2010

Listening to reflections on the future of software

Mentioned by Jean Bezivin on Twitter: some reflections by Michael Nygard on the future of software development. They sound reasonable, although Java may have more (and better) road ahead of it than Michael estimates. The essentials, mobile computing and its companion "cloud computing," appear to be forcing the pace with their requirements:
  • Two obvious trends are cloud computing and mobile access. They are complementary. As the number of people and devices on the net increases, our ability to shape traffic on the demand side gets worse. Spikes in demand will happen faster and reach higher levels over time. Mobile devices exacerbate the demand side problems by greatly increasing both the number of people on the net and the fraction of their time they are able to access it.
  • Large traffic volumes both create and demand large data. Our tools for processing tera- and petabyte datasets will improve dramatically. Map/Reduce computing (a la Hadoop) has created attention and excitement in this space, but it is ultimately just one tool among many. We need better languages to help us think and express large data problems. In particular, we need a language that makes big data processing accessible to people with little background in statistics or algorithms.
  • Speaking of languages, many of the problems we face today cannot be solved inside a single language or application. The behavior of a web site today cannot be adequately explained or reasoned about just by examining the application code. Instead, a site picks up attributes of behavior from a multitude of sources: application code, web server configuration, edge caching servers, data grid servers, offline or asynchronous processing, machine learning elements, active network devices (such as application firewalls), and data stores. "Programming" as we would describe it today--coding application behavior in a request handler--defines a diminishing portion of the behavior. We lack tools or languages to express and reason about these distributed, extended, fragmented systems. Consequently, it is difficult to predict the functionality, performance, capacity, scalability, and availability of these systems.
  • Some of this will be mitigated naturally as application-specific functions disappear into tools and frameworks. Companies innovating at the leading edge of scalability today are doing things in application-specific behavior to compensate for deficiencies in tools and platforms. For example, caching servers could arguably disappear into storage engines and no-one would complain. In other words, don't count the database vendors out yet. You'll see key-value stores and in-memory data grid features popping up in relational databases any day now.
  • In general, it appears that Objects will diminish as a programming paradigm. Object-oriented programming will still exist... I'm not claiming "the death of objects" or something silly like that. However, OO will become just one more paradigm among several, rather than the dominant paradigm it has been for the last 15 years. "Object oriented" will no longer be synonymous with "good".
  • Some people have talked about "polyglot programming". I think this is a red herring. Polyglot is a reality, but it should not be a goal. That is, programmers should know many languages and paradigms, but deliberately mixing languages in a single application should be avoided. What I think we will find instead is mixing of paradigms, supported by a single primary language, with adjunct languages used only as needed for specialized functions. For example, an application written in Scala may mix OO, functional, and actor-based concepts, and it may have portions of behavior expressed in SQL and Javascript. Nevertheless, it will still primarily be a Scala application. The fact that Groovy, Scala, Clojure, and Java all run on the Java Virtual Machine shouldn't mislead us into thinking that they are interchangeable... or even interoperable!
  • Regarding Java. I fear that Java will have to be abandoned to the "Enterprise Development" world. It will be relegated to the hands of cut-rate business coders bashing out their gray business applications for $30 / hour. We've passed the tipping point on this one. We used to joke that Java would be the next COBOL, but that doesn't seem as funny now that it's true. Java will continue to exist. Millions of lines of it will be written each year. It won't be the driver of innovation, though. As individual programmers, I'd recommend that you learn another language immediately and differentiate yourself from the hordes of low-skill, low-rent outsource coders that will service the mainstream Java consumer.
  • To my great surprise, data storage has become a hotbed of innovation in the last few years. Some of this is driven by the high-scalability fetishists, which is probably the wrong reason for 98% of companies and teams. However, innovations around column stores, graph databases, and key-value stores offer developers new tools to reduce the impedance mismatch between their data storage and their programming language. We spent twenty years trying to squeeze objects into relational databases. Aside from the object databases, which were an early casualty of Oracle's ascension, we mostly focused on changing the application code through framework after framework and ORM after ORM. It's refreshing to see storage models that are easier to use and easier to modify.
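Nygard's remark that Map/Reduce is "ultimately just one tool among many" is easier to weigh with a concrete picture of the model. Below is a toy word-count sketch in plain Python; the function names and data are illustrative only and have nothing to do with Hadoop's actual API:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: sum the counts emitted for each distinct word."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["big data needs big tools", "map reduce is one tool"]
word_counts = reduce_phase(map_phase(docs))
print(word_counts["big"])  # 2
```

The point of the model is that the map and reduce phases are independently parallelizable; Hadoop's contribution is distributing them across machines, not the two-phase idea itself.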

Saturday, October 30, 2010

The Internet's weight in the UK economy

A Google study reported by the BBC: the Internet accounts for 7.2% of the United Kingdom's gross domestic product, which makes business on and with the network its fifth-largest sector by volume. Google commissioned the study from the Boston Consulting Group. The BBC says:
UK internet economy 'worth billions'
The internet is worth £100bn a year to the UK economy, a study has concluded.
The research, which was commissioned by Google, found that the internet accounts for 7.2% of the UK's gross domestic product (GDP).
If the internet was an economic sector it would be the UK's fifth largest, said the report from the Boston Consulting Group (BCG).
This would make the sector larger than the construction, transport and utilities industries.
Central pillar
Some 60% of the £100bn a year figure is made up from internet consumption - the amount that users spend on online shopping and on the cost of their connections and devices to access the web.
The rest comes from investment in the UK's internet infrastructure, government IT spending and net exports.
The report, The Connected Kingdom: How the internet is transforming the UK, says that the internet's contribution to GDP is set to grow by about 10% annually, reaching 10% of GDP by 2015.
The UK, according to the report, is the world's leading nation for e-commerce. For every £1 spent online to import goods, £2.80 is exported.
"This is the opposite of the trend seen in the offline economy, which exports 90p for every £1 imported," the report says.
Internet companies play a vital role in employment with an estimated 250,000 staff, the report finds.
Small businesses that actively use the internet report sales growth more than four times greater than that of less active companies.
The report also attempts to compare the UK to other countries in the Organisation for Economic Co-operation and Development (OECD).
Under its scoring system, the UK ranks sixth, above Germany, the US and France. The highest ranked country is Denmark.
"The internet is pervasive in the UK economy today, more so than in most advanced countries," said Paul Zwillenberg, partner with BCG.
"Several industries - including media, travel, insurance and fashion - are being transformed by it."
Matt Brittin, managing director of Google UK, said: "The internet is a central pillar of the UK's economy.
"The sector has come of age, and with great prospects for further growth the UK internet economy will be vital to the UK's future prosperity," he added.
Besides analyzing the Internet's domestic share in detail, the study compares the United Kingdom with the other countries leading Internet use. Among them, and in a lagging position (around twentieth place), only Spain earns a spot among the Ibero-American nations.

Wednesday, October 27, 2010

TPS: its history in one graphic

Pete Abilla, a frequent read on Lean Manufacturing, reproduces a graphic history of the fundamental milestones in the genesis of the Toyota system (TPS), accompanied by a somewhat longer description. Recommended for anyone interested in the history of Toyota's production system and its roots.
This history has two sources: the textual timeline published at Superfactory (already discussed here on an earlier occasion), and the graphic Rick Ladd generated from it.

Sunday, October 24, 2010

Investment strategies in Spain

Some days ago Javier Garzás, of Kibele Consulting, commented on the current state of Spain's strategic technology plans:
In Spain, the Ministry of Industry will cut its investment in ICT by 24% in 2011. It was to be expected. You could see it coming. The state's major investments in technology are over. And, I hope I'm wrong, but even though they say they will keep the Avanza plans, the ones that included subsidies for certifying companies in CMMI, ISO 15504, ISO 20000 and ISO 27001, in the coming years getting a grant will verge on the impossible. And to think they said they wanted to change Spain's productive model. Well, it won't be through technology, which, with its 24% cut, leads the budget reductions, ahead of tourism (a 16.3% cut), industry (11.8%) and energy (17.2%), and, take note, alongside a 23.4% increase for foreign trade.
From what I read, the reasons the Government gives are its new goals of "austerity, efficiency in public spending and a push to change the pattern of productive growth". So what will the plan for changing the productive model be? If the cuts are any indication, ICT has no place in the new productive model... will it be another Plan E? Seriously, nobody laugh: that famous, "brilliant" Plan E amounted to 5,000 million euros, much of it spent tearing up sidewalks only to lay them down again (or even building discotheques), while in 2011 the budget of the Secretariat of State for Telecommunications will be 1,116 million. Without a doubt, a clear strategy for changing Spain's productive model.
A profound strategic error by the government. Emilio Botín, chairman of Banco Santander and of Universia, made the same point from the banking side: "it would be a mistake" not to make an effort to increase investment in university education and in research and innovation because, he stated, "Spanish society faces important challenges, both its own and global, whose outcome will condition the future of the coming generations" (...) "Countries are ever more aware of the need to push universities toward excellence and to strengthen their role in R&D&i," said the chairman of the bank, for whom cooperation among universities, companies and public administrations "is increasingly important". (Reported in El Economista.)

Wednesday, October 20, 2010

Informal statistics on Google's dominance (of the Internet)

Read at Denken Über: Pingdom publishes some not-very-formal statistics from different areas of the Internet that show Google's overwhelming dominance in strategic areas: search (desktop and mobile), advertising, video, news feeds, analytics tools, maps, and more. A graphic view of its aggressive business expansion. A dominance it may well need, but one that does not necessarily mean hegemony over the industry. Facing it, the growing opposition of its competitors, increasingly channeled through litigation over patents, privacy violations, market dominance and more. A contest of hegemonies that would be worth recapitulating. We will try to do so...

Saturday, October 16, 2010

Oracle/Java: an about-face by IBM?

First read on InfoQ, and then everywhere... IBM announces that it will collaborate with the open Java community (OpenJDK):
REDWOOD SHORES, CA & ARMONK, NY - 11 Oct 2010: Oracle (NASDAQ: ORCL) and IBM (NYSE: IBM) today announced that the companies will collaborate to allow developers and customers to build and innovate based on existing Java investments and the OpenJDK reference implementation. Specifically, the companies will collaborate in the OpenJDK community to develop the leading open source Java environment.
With today's news, the two companies will make the OpenJDK community the primary location for open source Java SE development. The Java Community Process (JCP) will continue to be the primary standards body for Java specification work and both companies will work to continue to enhance the JCP.
The collaboration will center on the OpenJDK project, the open source implementation of the Java Platform, Standard Edition (Java SE) specification, the Java Language, the Java Development Kit (JDK), and Java Runtime Environment (JRE).
Oracle and IBM will support the recently announced OpenJDK development roadmap, which accelerates the availability of Java SE across the open source community.
"The Java community is vital to the evolution of the Java platform," said Hasan Rizvi, senior vice president, Oracle. "The collaboration between Oracle and IBM builds on the success of OpenJDK as the primary development platform for Java SE."
"IBM, Oracle and other members of the Java community working collaboratively in OpenJDK will accelerate the innovation in the Java platform," said Rod Smith, vice president, emerging technologies, IBM. "Oracle and IBM's collaboration also signals to enterprise customers that they can continue to rely on the Java community to deliver more open, flexible and innovative new technologies to help grow their business."
Java is a general-purpose software development platform that is specifically designed to be open and enable application developers to "write once, run anywhere." The platform is most widely used in business software, web and mobile applications.
A wave of commentary has followed this move. There is agreement that the deal, whatever its positives, also has a loser: the Apache Harmony implementation of Java, in which IBM participates and which collides with OpenJDK. InfoQ says:
InfoQ asked if IBM would continue to support both Apache Harmony and the other ASF projects with which it is involved. Smith suggested that they would, but also made it clear that IBM would be shifting its development effort from Apache Harmony to the OpenJDK. As part of this, Smith noted that IBM could bring some innovations from the Harmony project across to the Java SE Reference Implementation.
What direction will Java take now? InfoQ gathers some positive opinions:
From Mike Milinkovich, of the Eclipse Foundation:
Today’s announcement that IBM is going to join forces and work with Oracle on OpenJDK is good news for Java, and by extension for Eclipse. All of us who live within the Java ecosystem need to recognize that this fundamentally strengthens the platform, enhances the business value of Java and offers the hope of an increased pace of innovation.
Although it will take a while for all of the ramifications and reactions to become clear, at its face the announcement challenges the conventional wisdom that the future of Java is going to be a fractured one. Some recent examples of these expectations can be seen in blog posts like James Governor’s “Java: The Unipolar Moment, On distributed governance for distributed software” and Joseph Ottinger’s “The Future of Java: forking, death, or stasis”. When I read them just a short time ago, I thought they accurately reflected the likeliest outcomes for Java’s sure-to-be fractious future. Now I am much more optimistic that we can get back to innovation.
To me the overarching motivation is obvious. Both IBM and Oracle have a shared interest in assuring their enterprise customers that Java was, is and always will be the safe technology choice that they’ve been selling for the past ten to fifteen years. As much fun and excitement as a further escalation of the “Java Wars” would have been, both companies have a very large vested business interest in combining forces, closing ranks and focusing on reassuring their customers that Java should remain their platform of choice.
This announcement fundamentally alters the equation in at least three important ways.
  • The presumption of conflict: Implicit in almost all of the recent writings on the future of Java is the notion that IBM’s interests would lie in direct competition, if not outright conflict with Oracle’s. Many have been assuming that IBM would eventually snap and declare war on Oracle’s Java hegemony, with the battles being fought in places like OSGi, Apache and Eclipse. It is now apparent that is not going to happen. Furthermore, now that IBM is working with Oracle on OpenJDK, we can expect a lot more mutual support within the JCP on driving specifications, especially platform specifications, forward.
  • Oracle is focused on reviving the business of Java: In case you hadn't noticed, Oracle's stewardship of Java is going to be a significant departure from Sun's. As Amy Fowler said, "…this is a practical company who isn't suffering from an identity crisis and knows how to make money from software." A couple of thoughts on the differences: First and foremost Oracle actually has resources to invest in moving Java forward, whereas Sun's financial weakness prevented forward progress for at least the past three years. Second, Oracle is putting in place the software engineering discipline and process to ensure that future releases of Java can happen on a much more reliable and predictable timetable than Sun's. Third, Oracle is large enough and confident enough in its execution that it is much more comfortable in striking business deals with its co-opetition such as IBM. It will be darn interesting to see if they are successful in signing up more participants down the road. And finally, there will be less talk about community-driven motivations and more focus on the business. In my opinion, all but the last of those are unequivocally positive. But Oracle's current focus on the business at least offers the hope that it may pay community dividends down the road. It is a lot easier for large companies to consider community motivations when they're profitable and feel that they have momentum on their side. The past couple of years of Java have been years of stalemate, lack of innovation and lost opportunities. Turning that around has to be job one if Oracle is going to see a return on its acquisition.
  • This is an inflection point in the Oracle-IBM relationship: If you think back a few years ago, IBM and BEA were two companies who competed fiercely in the Java marketplace, but managed to collaborate on many JCP specifications and in numerous open source projects at places such as Apache and Eclipse. It was a mature industry relationship. Maybe I’ve missed it, but I haven’t seen a similar historical pattern with IBM and Oracle, even after Oracle acquired BEA. This is an important step in the relationship between the two companies, at least in the Java space. Hopefully it is a harbinger of additional collaboration.
The big question is what are going to be the reactions of the other significant players in the Java ecosystem. The actions of Google, SAP and VMware in particular will all be interesting to watch.
From Mark Reinhold, Chief Architect of the Java Platform Group at Oracle:
I’m very pleased that IBM and Oracle are going to work more closely together, and that we’re going to do so in the OpenJDK Community. IBM engineers will soon be working directly alongside Oracle engineers, as well as many other contributors, on the Java SE Platform reference implementation— starting with JDK 7.
I expect IBM’s engineers to contribute primarily to the class libraries, working with the rest of us toward a common source base for use atop multiple Java virtual machines. We each have significant ongoing investments in our respective JVMs; that’s where most of the enterprise-level feature differentiation between our respective products is found, and it makes sense for that to continue. Focusing our efforts on a single source base for the class libraries will accelerate the overall rate of innovation in the JDK, improve quality and performance, and enhance compatibility across all implementations.
Our tighter collaboration will be evident not just in OpenJDK but also in the Java Community Process. IBM has endorsed Oracle’s proposal for Java SE 7 and Java SE 8, which already has strong support from across the community. We’ll also join forces to enhance the Java Community Process so that it remains the primary standards body for Java specifications.
This is excellent news, for the Java SE Platform and for OpenJDK. I’ve gotten to know many of IBM’s top Java engineers over the years, and I now look forward to working more closely with them.
Also very important is the opinion of Bob Sutor, who has long held responsibility for this area at IBM:
When people talk about open source, the notion of “forking” often comes up. The idea is that some folks are not happy with the direction in which a project is going, so they take a copy of the source code, come up with a new name, and set up shop elsewhere. This is no guarantee that the newly forked project will be successful, but it functions as an important escape valve for those who have donated time and effort to a community project and want to see the work done in what they believe is the right manner.
You less often hear about what I’ll call a “reverse fork”: people developing largely similar but separate projects who decide that they instead want to work together. They can do this for a variety of reasons but it all comes down to “burying the hatchet” or otherwise resolving their differences for the sake of the project.
(...) IBM will work with Oracle and the Java community to make OpenJDK the primary high performance open source runtime for Java. IBM will be shifting its development effort from the Apache Project Harmony to OpenJDK. For others who wish to do the same, we’ll work together to make the transition as easy as possible. IBM will still be vigorously involved in other Apache projects.
We think this is the pragmatic choice. It became clear to us that first Sun and then Oracle were never planning to make the important test and certification tests for Java, the Java SE TCK, available to Apache. We disagreed with this choice, but it was not ours to make. So rather than continue to drive Harmony as an unofficial and uncertified Java effort, we decided to shift direction and put our efforts into OpenJDK. Our involvement will not be casual as we plan to hold leadership positions and, with the other members of the community, fully expect to have a strong say in how the project is managed and in which technical direction it goes.
We also expect to see some long needed reforms in the JCP, the Java Community Process, to make it more democratic, transparent, and open. IBM and, indeed Oracle, have been lobbying for such transformations for years and we’re pleased to see them happening now. It’s time. Actually, it’s past time.
Ultimately this is about making Java more successful and pervasive than ever before. Java is not about any single company’s technical direction and it helps prevent lock-in. It runs on many, many different operating systems and hardware platforms. As a blatant plug, let me say Java runs exceptionally well on Linux and IBM’s System z, POWER, and Intel-based hardware. Indeed Java is one of the open standards that makes System z the amazingly interoperable platform that it is.
Java is about compatibility and always has been. It’s not been easy to maintain runtime environments that are consistent across platforms while exploiting the underlying features and performance advantages of those platforms. With this newly unified OpenJDK open source project, we can give customers the confidence they need to continue to invest in Java-based solutions knowing that they will get the best technology, the most important innovations, and the tightest collaboration among industry leaders.
We believe that this move to work together on OpenJDK is in the best interests of IBM’s customers and will help protect their investments in Java and IT technology based on it.
So to summarize my opinions on this: OpenJDK represents the best chance to provide a top notch unified open source runtime for Java; customers will benefit by having first class Java open standards developed collaboratively and constructively; and our energy will be focused on working together and optimizing our joint work, rather than wasting time on duplicative projects.

Among the critical opinions, one is especially instructive: that of Paul Fremantle, a participant in Apache Foundation projects, who agrees with Sutor on the essential points:
This week IBM announced it would be supporting Oracle's OpenJDK. At first glance it seems like "Great!"
Isn't it good that two big supporters of Java are getting behind a single open source project?
Well, in my personal opinion, no. It is bad. Bad for Java. I'll try to explain why.

The first point is that IBM are not just saying they will support OpenJDK. They are also saying that they are pulling effort out of Apache Harmony. Apache Harmony is a project to build an Open Source JVM under the Apache license, rather than the GPL, which is the license under which OpenJDK is available.

Harmony significantly predates OpenJDK and parts of Harmony are widely distributed in Android phones. Unfortunately there is a huge cloud over Harmony right now, and this news just made that cloud a good deal blacker. The loss of some IBM committers from the project isn't the problem. Apache encourages enough diversity that projects live on when one company pulls out.

To understand the clouds over Harmony let's first look at the legal situation here. Intellectual property, as we all know, is protected by two main models: copyright and patents. Simplifying hugely, copyright is about copying code, patents about copying ideas.

Apache Harmony was designed and built as a clean room implementation of Java. So no code was copied from any existing copyrighted JVM. But that doesn't protect against copying ideas - because even if the developers came up with the same idea independently, the patent still applies.

So how do Open Source projects protect themselves against patent issues?

The main way is to work with Open Standards that are covered by Open Specification promises or "Royalty Free" patent licenses. This is where major IP owners such as IBM and Microsoft have stated that they will not exert patent rights over either Open Source or Open Standard implementations of a particular standard.

Likewise most new standards from organizations such as OASIS are built on a Royalty Free basis, which means that all the companies that helped author the standard offer a free patent license to anyone implementing the standard.

There is a model under which Sun (and now Oracle) offers protection from patent issues: The Java Specification patent grant says that as long as you fully implement the Java specification and pass the tests that prove it - the Technology Compatibility Kit (TCK) - then you have a perpetual royalty-free license to patent rights that Oracle has over Java.

This sounds great. Not only has Sun/Oracle made Java available under an Open Source license (GPL), but if you don't want to use GPL you can simply write another JVM that conforms to the tests and you won't be sued for patent infringement. Perfect. So what on earth are those whiners at Apache bothered about?

Unfortunately it isn't quite so simple. I hope you are following me so far. All Apache Harmony needs to do to protect against patent suits is to pass the TCK. Can Apache Harmony pass the TCK? Well, yes and no. Would it pass the TCK if the tests were run? Probably. Can the tests be run? No.

The TCK is not available to Apache in a way that allows Apache to run it. The JDK is available as Open Source, but the TCK isn't. To protect against patent issues, you have to talk to Oracle and get the TCK. And they will only give it to Apache with restrictions. In particular restrictions of a kind called Field of Use (FOU) restrictions.

OK, this is becoming overly legal. I apologise. But I think it's important to understand this story, because this really gets to the heart of how open Java is.

Open Source is not just "published code". If you take an Open Source library, then you are allowed to redistribute the code without prejudice. This is key to Apache and the Apache license. So Apache can't build in restrictions on who is allowed to take Apache code or what they can do with it. The Apache license doesn't allow it.

Unfortunately, Sun - and now Oracle - have said that they will only give the TCK to Apache if it restricts how the Harmony code can be used. Effectively what Sun/Oracle is trying to say is that Harmony code cannot be used by mobile devices (like Android). If Apache were to go along with this, it would mean shipping Apache Harmony under a different license from the Apache License. And this would no longer be Open Source.

Why not? Well a key part of the definition of Open Source is that there is no restriction on the Fields of Endeavour. In other words, if Apache agrees to the FOU restrictions that Oracle insists on, then the result would be that Harmony would not be Open Source. Naturally Apache cannot agree to that.

Let's recap. Anyone can create an Open Source JVM, but they cannot get patent protection unless they agree to Oracle's FOU restrictions, at which point it is no longer Open Source. Therefore no-one but Oracle can create an Open Source JVM without fear of being sued.

Ok, this all sounds highly legalistic and possibly quite theoretical so far. That was until Oracle sued Google over patents breached by Android phones running code from Harmony. The gloves are off. And the real result of this is that the only Open Source JDK that you can rely on having a patent grant is the OpenJDK. And if I modify OpenJDK then I am at the mercy of Oracle to grant me a TCK license.

Unfortunately this is simply bad for Java. Java as a language is threatened by many other new and old languages. For many users it's simply a commodity runtime that they will use as long as it is commonly available. And knowing there are Open Source implementations that they can use is part of that decision. Knowing that there is effectively only one Open Source project that is free from Oracle's patent claims will affect the perception and the reality of Java's openness. And for many people the fact that this is under the GPL is an issue. You can see why IBM joined OpenJDK: Oracle has Apache Harmony in a tight place.

If you want to know more - and to hear the official Apache line as well as my own - then please take a look at Apache's letter to Sun when this first happened.

When this first blew up, Sun was a struggling company that you could have argued needed the extra revenue that Java licensing to mobile phones brought them, and which Harmony and Android threatened.

The question before us now is whether the same is true of Oracle, and whether Oracle is working in the best interests of its customers, the Java community, and the Open Source community. Do you believe that Oracle should license the TCK under an open license? And are you happy that, despite the move to take Java Open Source, there really is no freedom to create Open Source implementations of the Java language?

The good news for Harmony is that Apache's diversity approach means that IBM pulling out won't harm the future of the code. The bad news is that there is one less company putting pressure on Oracle to make Java truly open.
That Oracle intends to enforce its patents is demonstrated by the lawsuit filed against Google over Android. Beyond the fact that the Apache Foundation's advocates are particularly affected, the actual facts seem to prove them right, although IBM's decision may carry as much weight as Apache's effort on Harmony. It remains to be seen whether greed is the enemy of good business.

Wednesday, October 13, 2010

A note on the "Notes on Software Factories..."

I notice that the bibliographic compilation is missing references to Jack Greenfield, Keith Short and their work on software factories (Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools, among other documents). An unintentional omission, mainly because much has already been written here about this topic (1, 2). We will remedy their absence.

LinkedIn improves its tools

LinkedIn keeps improving its functionality. It has now launched LinkedIn Labs, making four initial projects public (NewIn 2.0, ChromeIn, Instant Search, and Signal). The last of these offers five hundred invitations to try it out.
Signal: LinkedIn Signal is technology that we recently unveiled at TechCrunch Disrupt 2010. Discover the power of faceted search over both the LinkedIn stream and tweets shared by our users, and find insights you won’t find anywhere else. John Wang, one of the lead engineers, shares some details about the technology behind Signal that includes JRuby, Scala, Voldemort, Zoie & Bobo.
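The faceted search Signal exposes boils down to counting matching documents per attribute value alongside the result list; in production LinkedIn does this with Bobo and Zoie, as the quote notes. The Python sketch below only illustrates the counting idea itself, with invented field names and a tiny in-memory document set:

```python
from collections import Counter

# Toy in-memory documents; the fields are invented for illustration.
docs = [
    {"text": "cloud computing on the rise", "industry": "software", "region": "US"},
    {"text": "cloud outage hits banks",     "industry": "finance",  "region": "EU"},
    {"text": "mobile cloud apps take off",  "industry": "software", "region": "EU"},
]

def faceted_search(docs, query, facet_fields):
    """Return the matching docs plus, for each facet field, a count per value."""
    hits = [d for d in docs if query in d["text"]]
    facets = {f: Counter(d[f] for d in hits) for f in facet_fields}
    return hits, facets

hits, facets = faceted_search(docs, "cloud", ["industry", "region"])
# facets["industry"] counts how many matching documents fall in each industry
```

A real engine computes these counts from an inverted index rather than a linear scan, which is what lets the approach scale to the full LinkedIn stream plus tweets.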

Tuesday, October 12, 2010

Notes on Software Factories, III

I would like to close a commentary on software factories begun a year ago. The topic arose, as noted in the first installment, from a now-resolved incident with the definition of the concept on Wikipedia. That initial incident prompted me to talk with some colleagues in the hope of contributing a more adequate definition than the one it had at the time. Those conversations produced a draft that never made it into Wikipedia, since the problem that motivated it ceased to exist. The draft remained, however, and these successive notes publish whatever may still be of interest.
So, to close out that document, here are the two or three points of some interest that have not yet been discussed.
Cusumano on the Japanese industry
Cusumano writes the following about the Japanese software industry in the 1970-1990 period, distinguishing between processes to which a software factory style can be applied and those where it cannot:
As for the future of Japanese-style factories as a way of organizing software development (and perhaps other types of design and engineering work), it remained possible that large centralized factory organizations represented a transitional stage in the evolution of Japan's approach to managing software development technology.
Between the late 1960s and the early 1990s, the factory initiatives provided a useful mechanism to centralize, study, and manage a series of projects more efficiently than treating each effort as separate, with scale economies restricted to individual jobs and no scope economies systematically exploited. With improvements in electronic communications technology, it was no longer essential to concentrate large numbers of people in single locations, as seen in the NEC and Fujitsu cases, although Hitachi and Toshiba managers clearly preferred to bring people together, and it seemed more likely that factory-like organizations would continue to co-exist with job shops, depending on the kind of software being built as well as company objectives.
In general, factory strategies appeared best suited for software systems that could rely on reusable designs and components as well as common development tools and techniques. For totally new or innovative design efforts, Japanese software producers tended to utilize less structured organizational forms, such as special projects, laboratories, or subsidiaries and subcontractors, that gave individual engineers more freedom to invent and innovate. To the extent that Japanese firms wished to place more emphasis on individual creativity, they were likely to build more software in non-factory environments as well as emphasize the design and engineering roles of personnel in internal projects, especially since new engineering recruits in Japan seemed to prefer labels other than "factory" to describe their places of work.

(Cusumano, Shifting Economies: From Craft Production to Flexible Systems and Software Factories, p. 46)

Reflections by Aaen, Bøtcher and Mathiassen on flexibility
From the paper by Ivan Aaen, Peter Bøtcher and Lars Mathiassen (Software Factories), mentioned at the start of this series, I would like to highlight the open approach of their research:
The term factory signals a commitment to long-term, integrated efforts—above the level of individual projects—to enhance software operations. This is not only a powerful, but also a necessary idea taking the challenges involved in professionalizing software operations into account. But for many the term factory has at the same time the controversial connotation that software development and maintenance is comparable to mass-production of industrial products, and arguably this is not the case. This can easily lead to illusions with respect to the kinds of interventions that can, in fact, improve software operations. It is not surprising, therefore, that some software professionals like the concept while others do not.
The term factory can be used to denote either one or more buildings with facilities for manufacturing or the seat of some kind of production. To many people the concept of a factory also implies a particular way of organizing work with considerable job specialization, formalization of behavior and standardization of work processes.
In this paper we will not adopt this historically based connotation. Rather we use the term without assumptions regarding particular ways to standardize, formalize, specialize, or achieve functional grouping. The factory is an organization inhabited by people engaged in a common effort; work is organized one way or the other; standardization is used for coordination and formalization, and systematization is important; but there will be several options for the design of a particular software factory. This paper investigates how existing approaches to the software factory have chosen among such options by fitting each approach into one of Henry Mintzberg's five basic organizational structures (Mintzberg, Structures in Fives: Designing Effective Organizations, Prentice-Hall 1983): the simple structure (organic, centralized, direct supervision); the ad-hocracy (organic, decentralized, mutual adjustment); the machine bureaucracy (bureaucratic, centralized, standardized processes); the professional bureaucracy (bureaucratic, decentralized, standardized skills); and the divisionalized form (decomposed based on standardized output).
(...) The goal is to clarify useful contributions and possible illusions related to the idea of a software factory.

(...) We agree with Cusumano that the challenge for software management is to find ways to “improve organizational skills—not just in one project but across a stream of projects. To accomplish this, however, and still meet the demands of customers, competitors, and the technology itself, requires firms to balance two seemingly contradictory ends: efficiency and flexibility” (Cusumano 1991, p. 5).
The inherent complexities involved in developing and maintaining software suggest that the appropriate organizational form for a software operation is the professional bureaucracy in which professional competence is viewed as more important than standardized procedures and advanced technologies. Software managers are therefore advised to view the professional bureaucracy as the ideal and dominant form while elements of the machine bureaucracy and other organizational forms (Mintzberg 1983) should be treated as supplements to cope with variations, to increase efficiency whenever industrialized procedures are feasible, and to allow for greater flexibility in unique situations where existing procedures and experiences are insufficient. For this reason any long-term management commitment to improve software operations should fundamentally be based on approaches focusing on software processes, (...) and view approaches focusing on infrastructure as supplementary strategies that can help develop environments in which processes and professionals are better supported.
A few lines on recognized sources
Two generally recognized sources on the subject, and deservedly so, are Michael Cusumano and Yoshihiro Matsumoto, cited and analyzed by practically every author I have read. Then come Michael Evans and Herbert Weber, all writing in the late eighties or early nineties. Probably influenced by these authors' studies, Hitachi, Toshiba, Fujitsu, the Eureka project and the Thales project are among the cases most often mentioned and analyzed. Since the late nineties, CMM has been the most widely analyzed set of recommendations. In recent years these works have evolved toward research oriented to model-driven development and to the concepts of Software Product Lines (SPL), which in general have not been covered in these notes, though they have been in countless others.

As for basic works on factories, these are some of them:

Cusumano, M. A. (1989): The Software Factory: A Historical Interpretation. IEEE Software, March.
Cusumano, M. A. (1991): Japan’s Software Factories. Oxford University Press.
Matsumoto, Y. (1981): SWB System: A Software Factory. In H. Hunke (Ed.): Software-Engineering Environments. Amsterdam: North-Holland.
Matsumoto, Y. (1987): A Software Factory: An Overall Approach to Software Production. In P. Freeman (Ed.): Software Reusability. IEEE.
Matsumoto, Y. and Ohno, Y. (1989): Japanese Perspectives in Software Engineering. Addison-Wesley.
Paulk, M., Curtis, B., Chrissis, M. and Weber, C. (1993): Capability Maturity Model for Software (Version 1.1). Technical Report CMU/SEI-93-TR-024, Pittsburgh: Software Engineering Institute, Carnegie Mellon University, February.
Weber, H. (Ed.) (1997): The Software Factory Challenge - Results of the Eureka Software Factory Project. IOS Press.
Evans, M. W.: The Software Factory: A Fourth Generation Software Engineering Environment. John Wiley & Sons.
An antecedent that Pedro Molina rightly pointed out at the time is Parnas:
Parnas, D. (1976): On the Design and Development of Program Families. IEEE Transactions on Software Engineering, March. It is also an antecedent of SPL.

Some of the papers read in the course of this compilation have been:
Concepto y Evolución de las Fábricas Software, by Javier Garzás and Mario Piattini.
Software Factories, by Ivan Aaen, Peter Bøtcher and Lars Mathiassen.
Improving MDD Productivity with Software Factories, by Benoît Langlois, Jean Barata and Daniel Exertier.
Making the Software Factory Work: Lessons from a Decade of Experience, by Harvey P. Siy, James D. Herbsleb, Audris Mockus, Mayuram Krishnan and George T. Tucker.
Diffusing Software-based Technologies with a Software Factory Approach for Software Development: A Theoretical Framework, by Lim Ngang-Kwang, Ang S. K. James and Pavri F. N.
Others by Cusumano and Matsumoto have been mentioned earlier.

Monday, October 04, 2010

Where will the patent claims end?

Regarding Oracle's and Microsoft's lawsuits against Google and/or Motorola, one might imagine that tomorrow Bell could sue every telephone company for using "a system that transmits voice by electronic means" without paying patent royalties...
What could we patent? After that, it is only a matter of having the right lawyer... From Infobae, but also reported in other media:
Complaints against Google's Android operating system continue to mount, now from Microsoft, which has accused Motorola of infringing several patents in its phones, related to the advances introduced in each version of the environment created by the Internet giant.
Microsoft filed complaints against Motorola before the International Trade Commission (ITC) and a Washington court for the infringement of several patents in its models running Android; that is, the phones of the Motorola Droid family, rebranded in Europe as Milestone and Dext.
The explanation came from Microsoft's vice president for corporate affairs, Horacio Gutiérrez.
In a note published on one of the company's blogs, he limited the complaints to those "related to ongoing developments in the field of smartphone technology".
Specifically, Gutiérrez adds, to the functions that make them "smart", such as the ability to synchronize email accounts in real time, Exchange ActiveSync.
Another of the claims refers to the everyday use of these Android models as agenda managers: appointments, calendar, contact lists, and so on.
As for technical advances, Microsoft's claims against Motorola concern patents for managing storage space and for signal-coverage indicators.
After distancing itself from Motorola, Microsoft reached an agreement with HTC, one of the companies that has profited most from Google's mobile operating system, to license some of its mobile-phone patents for the manufacturer's Android devices. However, the terms of the agreement were not made public, so it is not possible to know whether there is any incompatibility in this respect.
Another reason to impede Android's progress lies in the direct competition it will pose to Microsoft's new operating system, Windows Phone 7, which aspires to win smartphone market share by increasing customer satisfaction through quality.
The official presentation will take place in New York on October 11, ahead of the phone's launch days later in North America and the largest European countries.

Saturday, October 02, 2010

ECIS, a new complaint against the ACTA negotiations

First read on Barrapunto: the ACTA negotiations tenaciously push ahead, now openly questioned by several of the major players in the software industry (taken from NacionRed):

A powerful business coalition led by Google Europe, the European Committee for Interoperable Systems (ECIS), which also represents Adobe Systems, IBM, Oracle, Opera and Nokia, has expressed its concern about the direction the ACTA negotiations are taking; the agreement could be signed in Tokyo this very week.
ECIS is an association founded in 1989 to promote interoperability and free competition, as well as a favorable environment for ICT solutions in Europe. This has kept it in permanent conflict with Microsoft.
Specifically, the ECIS members object to the regression that the Anti-Counterfeiting Trade Agreement (ACTA) represents with respect to the already very restrictive Directive 2001/29/EC, in which, thanks to the European Parliament, agreement was at least reached, as the ECIS members acknowledge verbatim in their statement (pdf), to allow circumvention of the technological protection measures of computer programs where needed to achieve interoperability (with other programs) and to prevent powerful players from obstructing competition and innovation.

But now Google and the other companies that make up ECIS denounce that ACTA does not make this exception: "No provision of the current ACTA draft indicates that such provisions do not apply to computer programs."

Software patents

ACTA extends the US patent system to the rest of the world. Google and its ECIS partners are up in arms because the ACTA draft already criminalizes "software patent infringement", prison sentences included. A new worldwide Inquisition in which software developers will be deterred from taking risks and from continuing the magnificent work they are doing in Europe.
ECIS forcefully denounces the very serious consequences of this aspect of ACTA:
The impact would be particularly severe in the area of software interoperability, where the lack of a clear interoperability exception for patents allows patent holders to threaten alternative developers.
The imposition of criminal penalties on developers could aggravate the damage even further. The consequences would be potentially even more drastic for open source developers in Europe, who out of fear would have to abandon...


Finally, ECIS takes the European Union to task, having to remind politicians of the importance of transparency in negotiations over a trade agreement with so many repercussions for the lives of citizens around the world and for the activity of companies themselves.
The European Commission and the Council should guarantee greater transparency in the ACTA negotiations. The Treaty on the Functioning of the European Union (hereinafter TFEU) recognizes that "in order to promote good governance and ensure the participation of civil society, the Union's institutions, bodies, offices and agencies shall conduct their work as openly as possible". We note that the negotiations have not been conducted in accordance with these standards of transparency.
ECIS asks the European Commission and the Council of the European Union to ensure that all parties are heard at this crucial moment in the negotiation.
The full ECIS statement.
The latest known draft of ACTA.
Enrique Dans's commentary.

Model-driven design, in transition

Some days ago, Jean-Jacques Dubray published in InfoQ some reflections on model-driven development, inquiring into its current state and into its prospects for gaining ground, or its difficulties in doing so. His article was followed by a good number of comments, which also give an approximate idea of the state of its use:

Ulrik Eklund published a summary of the keynote speech by Jon Whittle at the SPLC 2010 conference. In his talk, Whittle presented findings on experiences with model-based development in industry from the EA-MDE project. The project is interested in understanding the factors that lead to success or failure with model-driven engineering (MDE), to help design the next generation of MDE tools. The question is not new: two years ago, Sven Efftinge, Peter Friese and Jan Köhnlein published an article on "MDD best practices", and Johan Den Haan, CTO of Mendix, published an article on how an MDD initiative could fail. Adopting an MDE approach can be quite daunting. Johan concluded his article with:
It’s not my goal to discourage you from starting with model-driven software development. I just wanted to show you the complexity of it and wanted to share some thoughts hopefully pointing you at directions helping you to overcome the complexity of MDE.
Two weeks ago he also published an article on his blog detailing the lessons he learned while building a Model Driven Software Factory and he reiterated the same skepticism:
I see Model Driven Software Development as an important part of the future of software development. However, I also see a lot of people struggle with actually using Model-Driven techniques and applying them in their daily business. It isn't trivial to build a successful Model-Driven Software factory (MDSF).

The question is actually quite popular. Last week, Marco Brambilla and Stefano Butti also published a presentation on the same topic, applying BPM and MDD to a large-scale banking scenario with BPMN, WebML and WebRatio.
In his talk, Jon provided some key success factors for an MDE approach, as identified in his research:
  1. Keep the domains (modelled, I assume) tight and narrow.
  2. Target well-known domains.
  3. Put MDD on the critical path (he means that pilot projects never get sufficient attention and resources).
  4. MDD works best from the ground up.
  5. Be careful of gains that are offset elsewhere.
  6. Don't obsess about code generation.
  7. Not everyone can think abstractly.
  8. Most projects fail at scale-up.
  9. Match tools and processes to the way people think, not the other way around.
In their article, Sven, Peter and Jan also warned:
Our most important recommendation for the reader is: be pragmatic. DSLs and code generators can, when used appropriately, be an immensely useful tool. But the focus should always be the problem to be solved. In many cases, it makes sense to describe certain, but not all, aspects using DSLs.
Even though MDE is still evolving quite rapidly, Jon reports that 83% of the EA-MDE survey respondents "consider MDE a good thing".
After over a decade of Model Driven Architecture, Development and Engineering (which itself followed rich model-driven development environments like NeXTStep that emerged in the late 80s), models are everywhere, yet in such a small quantity that our industry seems to still be looking for the path that will make Model-Driven approaches mainstream. Are we really in a position to define "lessons learned" or "best practices" with so few successes reported? What is hampering us? Is it the focus and precision of the models? Is it a lack of tools? Standards? Is it the level of abstraction that makes it difficult for most people to create a viable software factory? What's your take on it?
This closing invitation from Jean-Jacques to discuss drew many interesting responses. Here are some of them:
Rui Curado, author of his own tool, highlights something I agree with:
Existing model-driven approaches were conceived by "top-notch" developers/mathematicians. This led to the assumption that those who would use such approaches are "top-notch" developers too. And this is precisely what is happening: current MDD practitioners belong to the "top 1%".
Most of the world's developer population (myself included) are what I call the "average developer mortal": developers who are good at their job without actually being "geniuses". People who do things the right way, but pragmatically. We are the "everyday programmer". MDD approaches and tools fail to address us.
On the object-oriented approach of today's MDD-oriented tools, Dean Wampler says:
[...] all the MDD approaches that I know of are based on objects. Unfortunately, objects have proven to be poor at defining reusable component standards. That's a controversial statement, but to prove it, ask yourself why the world's most successful "component" standards, like TCP/IP, HTTP, busses in digital electronics, etc. are not object oriented, while OO-based component standards, like CORBA, proved to be too complex to survive the test of time and to achieve ubiquitous, permanent adoption. (Still not convinced? "Design Rules, Vol. 1: The Power of Modularity" discusses the characteristics of successful modularity standards. Very few OO APIs meet those criteria.)
Steven Kelly, on the conditions for producing a satisfactory model, says:
The key issues are:
1) use a language that is highly targeted, efficient and precise at describing the systems you want to build (and thus useless for most other systems)
2) generate production quality code that you don't need to look at or touch
3) use a toolset that makes creating the language, generators and models efficient, scalable and maintainable.
[...] Failing to reach (or even aim for) those 3 targets is what consigns most MDD attempts to mediocrity or failure. Using a tool or language because it's a "standard" or already in use is hardly likely to result in major productivity increases compared to normal. As Einstein said, insanity is doing the same things as before and expecting different results.
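Kelly's first two targets can be made concrete with a toy example: a narrowly targeted modeling "language" (here just a Python dict describing a data-entry form, a format invented for this sketch) and a generator whose output is never hand-edited. This illustrates the shape of a DSM pipeline, not any particular tool:

```python
# A narrowly targeted "language": just enough to describe one domain (forms).
form_model = {
    "name": "Customer",
    "fields": [
        {"name": "email", "type": "str", "required": True},
        {"name": "age", "type": "int", "required": False},
    ],
}

def generate(model):
    """Emit a Python class from the model; the output is never hand-edited."""
    lines = [f"class {model['name']}:"]
    lines.append("    def __init__(self, **kw):")
    for f in model["fields"]:
        if f["required"]:
            lines.append(f"        if '{f['name']}' not in kw:")
            lines.append(f"            raise ValueError('{f['name']} is required')")
        lines.append(f"        self.{f['name']} = {f['type']}(kw.get('{f['name']}', {f['type']}()))")
    return "\n".join(lines)

code = generate(form_model)
namespace = {}
exec(code, namespace)  # in a real pipeline the code would be written to a file
c = namespace["Customer"](email="a@b.com")
```

The productivity argument is that each new form costs a few lines of model rather than a new hand-written class; the precision argument is that the language can say nothing outside its domain, which is exactly point (1).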
Steven, answering a question from Dubray about his claim of complete code generation, says:
100% code generation is definitely the goal, and the norm is to achieve it. Here, 100% means 100% of the code that is specific to this model. There is of course hand-written code below the generated code: what we call the domain framework, i.e. code specifically written to complement this modeling language and generators, and which doesn't change per model.

In some cases a particular model may need an addition to the domain framework; in others, models may contain snippets of code or refer to per-model functions that are hand-coded separately.

If by "custom models" you mean "custom modeling languages", then yes, at least that has been my experience. I'm not sure it's "inherently" true, but I think there are good reasons for it. Have you ever seen how a tricky or laborious problem can be turned into something an order of magnitude faster and easier by applying the correct language? If so, you probably understand what I'm talking about. Obviously, in general development those situations were rare, because the right language didn't yet exist. DSM lets you create that "right language" for your needs.
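The split Kelly describes, generated model-specific code sitting on top of a hand-written domain framework, can be sketched in a few lines. Everything here (the StateMachine class, the door example) is invented for illustration; the point is only that the generated layer reduces to model-specific data and wiring:

```python
# --- Hand-written domain framework (written once, shared by all models) ---
class StateMachine:
    """Generic runtime; knows nothing about any particular model."""
    def __init__(self, transitions, start):
        self.transitions = transitions   # {(state, event): next_state}
        self.state = start

    def fire(self, event):
        # Unknown events leave the state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

# --- Generated layer (what "100% code generation" refers to): only the ---
# --- model-specific transition table, produced from a state diagram.   ---
door = StateMachine(
    transitions={
        ("closed", "open"): "open",
        ("open", "close"): "closed",
    },
    start="closed",
)

door.fire("open")
```

Because the framework never changes per model, regenerating the thin upper layer after a model edit is cheap, which is what makes "never touch the generated code" a workable discipline.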
Once again, a contested point is the use of UML, set against DSLs (Domain-Specific Languages). Naturally it is Steven Kelly, one of the creators of MetaEdit+, who puts particular emphasis on DSM (Domain-Specific Modeling). Kelly says:

As Dean Wampler said, "Give me a language and APIs that let me express what I need to do very succinctly!" That of course is the whole idea of DSM, and why it is so different from (say) UML-based MDA. It's just a shame that working "code generation" often gets tarred with the same brush as approaches that have failed.

Of course, we're all constrained by what we have experienced personally, e.g. Dean saying that all MDD he has seen has been based on object-oriented languages (I guess referring to UML?). It's just a fact that by nature humans trust personal experience far more than even statistically significant research.

Dean said "Nothing beats a Turing-complete, textual programming language for expressivity and productivity." That may be true if you have to pick one language to use for every single task for your whole life. But for any given problem space, it's simply not true. Expressivity is the ability to say everything you want, precisely and concisely - which is as good a definition of a Domain-Specific Modeling language as you're likely to get. The consistent measured increase in productivity by a factor of 5-10 with DSM is a proven fact. And when you've actually seen DSM in practice, as opposed to code generation with UML or IDE wizards, it's obvious why it is so much faster throughout.
My impression: it remains a fact that MDD is a concept under construction and in evolution. After years of emphasis on UML as the basic modeling tool, the failure to obtain robust results has brought us to a point where other alternatives are being explored. That robustness will probably arrive, and today's variants will be only part of the set of tools that survive. What will not happen is a return to tools that do not involve code generation and a high degree of abstraction.