Friday, November 30, 2007

India 2.0

At ZDNet (UK), Adrian Bridgwater writes about India's ongoing evolution from outsourcing provider to technology innovator. Beyond an initial phase of offering low-value work, not only are strong competitors emerging in vertical software development and consulting, but India is also trying to secure a place on the front line by innovating. Bridgwater speaks of a fundamental shift:
(...) "Innovation was always there, but the right conditions, ecosystem and critical mass did not exist from 1947 until the early 1990s. Now, venture capitalists are more willing to venture forth in emerging markets and back entrepreneurs — and India's universities have helped create critical mass in terms of skilled workers. So, as these factors start to coalesce, innovation and entrepreneurship are now much more evident in many different parts of India," said Kamla Bhatt, acclaimed Indian podcaster and presenter of The Kamla Bhatt Show.
(...) PricewaterhouseCoopers estimates that India has roughly two thirds of the global business-process outsourcing (BPO) market, with a value in excess of $7bn (£3.4bn) last year. However, it appears the country has ambitions beyond outsourcing; the Indian technology sector seems to be changing as it develops a new self-belief. Google's labs in Bangalore conceived the initial engineering for the company's Google Finance offering in less than 18 months. This type of development has created the momentum for venture capitalists to propel further developments and invest in an increasingly skilled workforce.
A key aspect is the sustained emphasis on training professionals qualified in new technologies:
The Indian Institute of Technology is widely regarded as the sub-continent's premier technology school; its seven locations churn out many of the ultra-keen software engineers that are starting to make headlines. Commentators claim there are visible signs that a shift towards higher-value work is occurring.
(...) "BEA has seen its training volumes in India rise from hundreds to thousands of students a year. Big systems-integration firms in many of India's industrialised cities now demand tailored courses and even train over the web, live linked to instructors in the US. The latest trend is for these SIs to become authorised trainers in their own right, so they can train in BEA's technologies to their own timetable in-house," said David Toso, senior vice president of BEA EMEA services.
(...) "India is certainly witnessing a secondary stage in the economic growth it derives from its technology sector, as it channels its workforce towards home-grown projects targeted at a global market," said Bruce Carney, head of developer programmes and services for Symbian. "In the past month, two leading Indian universities have joined our Academy programme and this type of knowledge-base expansion has created the momentum for venture capitalists to propel further developments and invest in an increasingly skilled workforce."
Bridgwater mentions General Motors as one of the cases with the strongest emphasis on innovation-driven development:
"We are starting to see the creation of technology as a direct revenue generator — not merely as an enabler for making some service delivery faster, better or cheaper. Much of this is still driven by global organisations that originally set up captives to exploit the cost advantages of the Indian skills market, but [which] have graduated to becoming a strategic part of the global technology-development capability of these organisations — truly contributing to the creation of their products and technologies, including product management. For example General Motors R&D in India is developing next-generation electronics and materials for cars of the future," added Stones.
There is no doubt that India will earn a place of importance through its own effort. Like other cases, it shows that it is possible to grow even when starting from positions of lower economic value. The question is understanding how...

Sunday, November 25, 2007

Software Factories: A definition redefined

These days I am rewriting, in whatever spare time I can find, some concepts from La Cuarta Generación that have become obsolete or unclear. Getting into it, I see that Jezz Santos, in October of this year, wrote a very useful article explaining the scope of software factories in their Microsoft version, which puts things in their place. I am flagging it now, and I hope to come back to the subject, here and in "La Cuarta...".
I want to highlight a plain admission by Jezz:
Although a software factory sounds like a new type of development tool which may have its own integrated development environment (IDE), in actual fact, Microsoft software factories are intended to extend and configure the general IDE of Visual Studio .NET.
Now, if I were to go back over all the theoretical underbrush about UML and paper napkins, I think we could at least say that there is a problem of domain mismatch in that discussion: Jezz is talking about an extension to a development environment dedicated to a specific framework, .NET, while UML sits at a higher level of theoretical generality.
A little more from Jezz:
First and foremost a software factory is a software development tool. Built by developers, primarily for developers, architects, and other roles in the software development life cycle, such as designers, testers, business analysts and project managers. This ‘tool’ is used to manage and automate the assembly and/or configuration of a software solution that addresses a well-known, specific business/problem domain, and is primarily used to create an executable solution from that.
How does a software factory differ from other general development tools and technologies we use today, such as C#, VB, and Visual Studio? A software factory is a specific, domain-focused tool, with a specific set of instructions, targeted at solving a small specific business/problem domain. Whereas a general development tool like C# or VB is used to build basically anything we want (within limits, of course); it's a relatively unspecific tool, used to solve any business/problem domain. The point here is that a general development tool can be used to build anything, and therefore it's left to you to manage the bounds of creating that thing (it is boundless). A software factory, through the use of a software factory schema, tooling and a managed runtime actually instructs and guides you through a known process for building the thing it knows how to build. It is very much constrained to the domain it was designed to address, and not much more.
(...)
This idea is really nothing new today in the Visual Studio IDE where we already use many existing abstractions to increase productivity in developing solutions: the Windows Forms Designer, Windows Workflow activity diagrams, XML schema editors, SQL, configuration files, among many others. What is new though, is providing custom dedicated abstractions tailored specifically to the product or solution we want to help build.
We will come back to this. Jezz's article is of real interest.
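To make that contrast concrete outside the Visual Studio tooling Jezz describes, here is a minimal sketch, in Python and with entirely invented names, of the core idea: a tool constrained to one small domain takes a declarative description and generates the only kind of artifact it knows how to build, rejecting anything outside its schema.

```python
# Toy "factory" for one narrow domain: simple record-keeping entities.
# ALLOWED_TYPES is the factory's whole world; anything else is out of scope.
ALLOWED_TYPES = {"int", "str", "float"}

def generate_entity(name: str, fields: dict) -> str:
    """Generate the source of a small data class from a declarative field spec."""
    for field, ftype in fields.items():
        if ftype not in ALLOWED_TYPES:
            raise ValueError(f"type '{ftype}' is outside this factory's domain")
    lines = [f"class {name}:"]
    lines.append("    def __init__(self, " + ", ".join(fields) + "):")
    lines.extend(f"        self.{f}: {t} = {f}" for f, t in fields.items())
    return "\n".join(lines)

if __name__ == "__main__":
    # The "solution" is generated, not hand-written, from the domain description.
    print(generate_entity("Customer", {"id": "int", "name": "str", "balance": "float"}))
```

A general-purpose language would of course let you write the same class by hand, and anything else besides; the point of the factory-style tool is that its schema narrows the choices and automates the assembly.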

Friday, November 23, 2007

The largest European software companies

I have just found it on Juan Palacio's blog: the ranking of the top one hundred software companies in Europe. If it saddens Juan not to find Spanish companies there, it surprises me. I would have thought that at least some of the well-known ones measured up enough to make the list. When I have time I will go over the numbers; perhaps they depend on how revenue is broken down. Not a single one reaching 20 million euros? Searching a little: perhaps the figures for Meta4 (is it still Spanish today?); Indra mostly bills services; Panda? I don't know. Still, the figures seem low. One note that tries to pin down figures does not get past five million. It is true that what I know is mostly service management and human-resources software, not much development of other kinds of software. And yet there are also many small innovative companies, including the area that interests me, code generators. A lot to think about, although others will have to be the ones to do it.

Sunday, November 18, 2007

Goodbye to ASC

This is more a personal memory than important news. Reading System I Network, I came across an advertisement for Abstract that did not come from ASC but from Help/Systems. Reading it, I also found SEQUEL there, so I understood that ASC had been bought, or had sold its product lines. I had to search a while before finding the news of its sale in October 2006. ASC was one of the product lines that the company I worked for represented until its disappearance. The Argentine economic crisis took Oryon, and the global shift in the business has probably taken ASC. ASC represented a business model that seems to have been swept away at some point: a small company devoted to a niche market, utilities for the AS/400 (sorry: ISeries, System I, or whatever name it is given next). Products that earned their place and made a name for themselves (Abstract was very useful for analyzing problems during the Year 2000 crisis; SEQUEL remains a powerful guided path for SQL-style queries). The end of the bubble in the United States brought a concentration of companies into fewer hands that has not yet stopped. In ASC's case, three factors surely played a part in its end: being in a single market, following IBM's pricing model, and, as a consequence of the first, having no alternatives when the ISeries market declined. Is that how it was? Perhaps I can find out something first-hand at some point. Is the ISeries in decline? At the very least, it is not participating in the market as fully as it could. Does IBM know what to do with the product? Many people ask themselves this, but that is another matter.

Monday, November 12, 2007

Cognos will be blue

EbizQ reports that IBM intends to buy Cognos, one of the last remaining large companies dedicated to business intelligence, for a price of approximately 4.9 billion dollars. After Hyperion and Business Objects, almost all the competitors in this business are becoming part of large corporate players: Oracle, SAP, IBM. The news at EbizQ:

Today IBM announced its intention to acquire Cognos in an all-cash transaction at a price of approximately $5 billion USD or $58 USD per share, with a net transaction value of $4.9 billion USD. The acquisition is subject to Cognos shareholder approval, regulatory approvals and other customary closing conditions, and is expected to close in the first quarter of 2008.
Steve Mills introduced the analyst briefing by talking about the state of the market, with companies becoming more sophisticated about leveraging information, and using analytics for both looking back, and looking forward, becoming more pre-emptive in their decision making. He also spoke about how real-time business analytics was becoming an important part of business process management.
Just about every analyst on the call would have to agree with those statements. Last June, in the ebizQ BI in Action virtual conference this was discussed extensively both in the panel session and in a series of pre-conference podcasts. Indeed, one of the analysts' questions was that we’ve been expecting this for a while, why now (answer – a $5 billion purchase takes time).
Cognos fits very well into the IBM stack. For a change it pretty much adds new functionality without adding a lot of redundancy. Furthermore, about 5 years ago Cognos re-architected their solution as a service based offering. It runs on top of IBM (and other) infrastructure software, providing real-time business intelligence and business performance management. The business performance management capabilities are key. It enables companies to align, monitor and measure business operations with business strategies. This is truly good stuff – very important to business managers.

As Beth Gold-Bernstein's article relates, IBM's view is very optimistic:
"Customers are demanding complete solutions, not piece parts, to enable real-time decision making," said Steve Mills, senior vice president and group executive, IBM Software Group. "IBM has been providing Business Intelligence solutions for decades. Our broad set of capabilities – from data warehousing to information integration and analytics – together with Cognos, position us well for the changing Business Intelligence and Performance Management industry. We chose Cognos because of its industry-leading technology that is based on open standards, which complements IBM's Service Oriented Architecture strategy."
(...) “This is an exciting combination for our customers, partners, and employees. It provides us with the ability to expand our vision as the leading BI and Performance Management provider,” said Rob Ashe, president and chief executive officer, Cognos. “IBM is a perfect complement to our strategy, with minimal overlap in products, a broad range of technology synergies, and the resources, reach, and world-class services to accelerate this vision. Furthermore, this combination allows Cognos customers to leverage a broader set of solutions from IBM to advance their information management driven initiatives.”
Am I forgetting any? Among the big ones, only Informatica Corporation is left. The ERP and business intelligence business keeps getting more concentrated.

Sunday, November 11, 2007

Steven Kelly again on Software Factories

During October, Steven Kelly published a couple of posts on his blog that I agree with in more than one respect. One of them we have already discussed; the other was prompted by an article by Jack Greenfield in Methods & Tools and picks up the discussion started when commenting on Jezz Santos: what comes first, the problem or the solution.
Greenfield's article explains his scheme better than others and contributes some points of interest, but it also reinforces the doubts about the value of his initiative. I had planned to go through some of his concepts in detail, but Kelly does it clearly. I recommend reading his opinion and following the discussion as a whole, which brings in references to Jezz Santos and Juha-Pekka that round out the idea.
Jack does hint at more specialized factories than the earlier very generic "horizontal" factories he has tended to talk about. But the level of abstraction is not being raised much above how people currently code:
Known good patterns and practices in the target domain are harvested, refined and encoded into domain specific tools and runtimes. The tools are then used to rapidly select, adapt, configure, complete, assemble and generate solutions that use the runtimes.

Let's do a thought experiment. Imagine yourself back in time to when applications were built with assembly language. Which of the words that I've italicized above would indicate a radical shift upwards in the level of abstraction? E.g. if you can select among some existing chunks of assembly language -- nice maybe, but you're still working on the same level: you've not moved to third generation languages yet. Only the final verb, generate, accomplishes that change.

In the same way, insofar as Jack's article is a good description of Software Factories, it looks like their emphasis is more on small percentage improvements of existing ways of building software. That's a shame, especially given that earlier they seemed more focused on the DSL and generation elements that they share with Domain-Specific Modeling. The $64,000 question is: why this change of emphasis? Jack, Jezz, Prashant Sridharan and other MS people have all made comments along the lines that doing real problem-domain-based DSM has proven too hard for them. Why are they failing, when so many others are succeeding? For examples of success, just take a look at the articles from the upcoming 7th OOPSLA workshop on Domain-Specific Modeling (e.g. 24, 14 and 10 are all graphical DSL examples).
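Kelly's point about what actually raises the level of abstraction can be illustrated with a small, hypothetical example in Python (not taken from any DSM tool): the behaviour is captured once in a declarative model, and the code that walks it stays generic, so the developer's work shifts from assembling program text to editing the model, whether that model is interpreted or used to generate code.

```python
# A hypothetical problem-domain model: a phone-call state machine described
# declaratively instead of as hand-written if/else dispatch code.
CALL_MODEL = {
    "idle":      {"dial": "calling"},
    "calling":   {"answer": "connected", "hangup": "idle"},
    "connected": {"hangup": "idle"},
}

def run(model: dict, state: str, events: list) -> str:
    """Generic runtime: only the model changes from one application to the next."""
    for event in events:
        state = model[state].get(event, state)  # ignore events invalid in this state
    return state

print(run(CALL_MODEL, "idle", ["dial", "answer", "hangup"]))  # -> idle
```

Selecting or configuring ready-made chunks of dispatch code, by contrast, would leave you editing at the same level as before, which is exactly Kelly's objection.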

Thursday, November 08, 2007

The web in the future

That the growing importance of the web is broadly changing how software is built and used is beyond doubt. Architectures, possibilities, tools, resources, and participants are all changing. Wharton's technology group comments on one of the aspects that clearly foreshadows changes in the ubiquity and mobility of software use: the integration of desktop applications and the web. In its unsigned article, Wharton weighs the main contenders and the different approaches to reaching this goal, and also ventures an opinion of its own.
On October 1, Adobe Systems announced an agreement to buy Virtual Ubiquity, a company that has created a web-based word processor built on Adobe's next generation software development platform. One day earlier, Microsoft outlined its plans for Microsoft Office Live Workspace, a service that will combine Microsoft Office and web capabilities so that documents can be shared online. Recently, Google introduced a technology called "Gears" that allows developers to create web applications that can also work offline. The common thread between the recent moves of these technology titans: Each company is placing a bet on a new vision of software's future, one which combines the features of web-based applications with desktop software to create a hybrid model that may offer the best of both worlds.
Wharton sees two styles, one coming from a desktop-based history and another born on the web, so that the emphasis falls either on the network or on the local machine (desktop/webtop). Its view is that neither of the two is sufficient on its own:
But as this drive toward hybrid desktop/webtop software illustrates, there are limits to both approaches, and the future for software may be a blend of the best features of both.
(...) The most likely outcome is a hybrid future where desktop and web-based software and services become intertwined to the point where users won't know the difference between the two, suggest experts at Wharton and elsewhere. "We believe that the future of technology at work will be a combination of local software on PCs, along with services," said Jeff Raikes, president of Microsoft's Business Division, in a question and answer session at the announcement of Office Live Workspace on September 30. "Think of it as a continuum, ranging from pure software to pure services approaches. Most customers will be somewhere in the middle."
Professors Eric Clemons and Kartik Hosanagar point to the most distinctive aspects of this trend:
(...) this model is likely to develop in two phases. "In the first phase, applications will provide essentially the same features as a desktop application, only you will now be able to access them from anywhere. Current web-based apps are good examples of this." For example, Yahoo Mail looks a lot like Microsoft's Outlook email program. Google Docs and Adobe's Buzzword mimic Microsoft Word and add perks like the ability to access your documents from any computer.
In this phase, occurring today, Hosanagar says desktop applications will offer more features than web-based software, but over time that advantage will erode.
In the second phase of this hybrid model, web applications and desktop software will co-mingle, says Hosanagar. "What's likely to be more exciting is the next phase, where these web-based applications can interact and share data with each other and become platforms [that developers can use to build more software]. Facebook has already become one such platform, as has Salesforce.com on the enterprise side. In the next phase, far more interesting things will happen as these web apps start talking to each other."
Clemons notes that another critical factor for the evolution of software will be mobile applications. "My bet is that desktop software will be used for home operations, and webtop software will be used for mobile applications," says Clemons. The key will be synchronizing desktop and web software wherever a person goes. "None of us has a good idea what these mobile applications will be, but they may provide real value."
Krishnan Anand adds another element, "pay-per-use":
Anand says another model that's likely to emerge is one that is based on usage. In this model, a person who used a program infrequently could employ the web-based version for free or a small fee. Heavy users would pay more based on usage. In this model, which would apply to both web-based and desktop software, Anand likens software providers to electric utilities. "The notion is you can charge different prices based on levels of usage," he says.
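The utility analogy boils down to tiered metering. A minimal sketch, with bracket sizes and rates that are purely invented for illustration:

```python
# Hypothetical usage-based pricing: successive brackets of hours, each with its
# own rate (all numbers invented purely for illustration).
TIERS = [
    (10, 0.00),             # first 10 hours free
    (40, 0.50),             # next 40 hours at $0.50/hour
    (float("inf"), 0.30),   # everything beyond that at $0.30/hour
]

def monthly_charge(hours_used: float) -> float:
    total, remaining = 0.0, hours_used
    for bracket_hours, rate in TIERS:
        billed = min(remaining, bracket_hours)
        total += billed * rate
        remaining -= billed
        if remaining <= 0:
            break
    return round(total, 2)

print(monthly_charge(5))    # 0.0   (an infrequent user stays free)
print(monthly_charge(100))  # 35.0  (10*0 + 40*0.50 + 50*0.30)
```

A free or near-free first bracket keeps the casual user, while heavy users fund the service, which is the usage-based model Anand describes.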
Nevertheless, reliability remains a weak point:
"Reliability is critical for many of us. Even now, networks crash and I can't access files. I still have to make sure I have a copy on desktop. Until that changes, I don't see an advantage to web-based applications."

Sunday, November 04, 2007

A bit of humor for analyzing the Web



InfoQ publishes a short note on the work of Paul Downey, who created a map of the web combining parts of its history, its standards, and other relationships, as if it were the map of Middle-earth from The Lord of the Rings. Following the note, another similar attempt appears, commented on by Amivdh and hosted at the W3C. Above, the two "maps".
The InfoQ note:
Paul Downey from BT has produced an adventuring map for the Web in the style of that used by the Fellowship of the Ring, or Bilbo. It includes historical aspects, relationships to standards and other wonderful visualizations. Using a Web 2.0 approach, the original map is augmented with various call-outs to emphasize different aspects, such as "Google's all seeing eye", Mordorsoft and "FUD" in what would be Mordor, the Tower of WS-Babel, which is surrounded by the Swamp of BPEL and the Ruins of CORBA. Although probably starting out as a fun way to visualize the world of SOA, Web Services and the Web in general, it makes some interesting points (such as the reference to the Maelstrom of Incompatibility and the Lost Tribe of UDDI).

Downey has made a detailed PDF of his map available. The Flickr site also allows a detailed tour of each part of the map, in successive photographs.