Saturday, November 14, 2009

Oslo: removing the ambiguity from expectations

A hard role has been reserved for Douglas Purdy: announcing the demise of the Oslo project, and doing so with enthusiasm. Considering the transience of the Internet, and the variability of the information Microsoft offers about its products, it is time to preserve a record of what Oslo was, before its memory is redesigned. I have tried to find specific Microsoft pages in The Internet Machine, but its pages seem difficult to archive: none of the searches returned results. Wikipedia remains, however: what follows is the essence of the still-unmodified version of the entry on the product, as of October 3, the date of its last change:


Originally, in 2007, the "Oslo" name encompassed a much broader set of technologies including "updated messaging and workflow technologies in the next version of BizTalk Server and other products" such as the .NET Framework, Microsoft Visual Studio, and Microsoft System Center (specifically the Operations Manager and Configuration Manager).[1]

By September 2008, however, Microsoft changed its plans to redesign BizTalk Server.[2] Other pieces of the original "Oslo" group were also broken off and given identities of their own; "Oslo" ceased to be a container for future versions of other products. Instead, it was identified as a set of software development and systems management tools:[2]

  • A centralized repository for application workflows, message contracts (which describe an application's supported message formats and protocols), and other application components
  • A modeling language to describe workflows, contracts, and other elements stored in the repository
  • A visual editor and other development tools for the modeling language
  • A process server to support deployment and execution of application components from the repository.

When "Oslo" was first presented to the public at the Microsoft Professional Developers Conference in October 2008, this list had been focused even further. The process server was split off under the code name "Dublin", to work alongside "Oslo", leaving "Oslo" itself composed of the first three components above, presently described (and rearranged) as follows:[3]

  • A storage runtime (the code name "Oslo" repository, built on Microsoft SQL Server) that is highly optimized to provide your data schemas and instances with system-provided best SQL Server practices for scalability, availability, security, versioning, change tracking, and localization.
  • A configurable visual tool (Microsoft code name "Quadrant") that enables you and your customers to interact with the data schemas and instances in exactly the way that is clearest to you and to them. That is, instead of having to look at data in terms of tables and rows, "Quadrant" allows every user to configure its views to naturally reveal the full richness of the higher-level relationships within that data.
  • A language (Microsoft code name "M") with features that enable you to model (or describe) your data structures, data instances, and data environment (such as storage, security, and versioning) in an interoperable way. It also offers simple yet powerful services to create new languages or transformations that are even more specific to the critical needs of your domain. This allows .NET Framework runtimes and applications to execute more of the described intent of the developer or architect while removing much of the coding and recoding necessary to enable it.

Relationship to "Dynamic IT"

"Oslo" is also presently positioned as a set of modeling technologies for the .NET platform and part of the effort known as Dynamic IT. Bob Muglia, Senior Vice President for Microsoft's Server & Tools Business, has said this about Dynamic IT:[4]

It costs customers too much to maintain their existing systems and it's not easy enough for them to build new solutions. [We're focused] on bringing together a cohesive solution set that enables customers to both reduce their ongoing maintenance costs while at the same time simplifying the cost of new application development so they can apply that directly to their business.

The secret of this is end-to-end thinking, from the beginning of the development cycle all the way through to the deployment and maintenance, and all the way throughout the entire application lifecycle.

One of the pillars of this initiative is an environment that is "model-driven" wherein every critical aspect of the application lifecycle from architecture, design, and development through to deployment, maintenance, and IT infrastructure in general, is described by metadata artifacts (called "models") that are shared by all the roles at each stage in the lifecycle. This differs from the typical approach in which, as Bob Kelly, General Manager of Microsoft's Infrastructure Server Marketing group put it,[5]

[a customer's] IT department and their development environment are two different silos, and the resulting effect of that is that anytime you want to deploy an application or a service, the developer builds it, throws it over the wall to IT, they try to deploy it, it breaks a policy or breaks some configuration, they hand that feedback to the developer, and so on. A very costly [way of doing business].

By focusing on "models"—model-based infrastructure and model-based development—we believe it enables IT to capture their policies in models and also allows the developers to capture configuration (the health of that application) in a model, then you can deploy that in a test environment very easily and very quickly (especially using virtualization). Then having a toolset like System Center that can act on that model and ensure that the application or service stays within tolerance of that model. This reduces the total cost of ownership, makes it much faster to deploy new applications and new services which ultimately drive the business, and allows for a dynamic IT environment.

To be more specific, a problem today is that data that describes an application throughout its lifecycle ends up in multiple different stores. For example:

  • Planning data such as requirements, service-level agreements, and so forth, generally live in documents created by products such as Microsoft Office.
  • Development data such as architecture, source code, and test suites live within a system like Microsoft Visual Studio.
  • ISV data such as rules, process models, etc. live within custom data stores.
  • Operation data such as health, policies, service-level agreements, etc., live within a management environment like Microsoft System Center.

There is little or no data sharing between the tools and runtimes involved. One of the aims of "Oslo" is to concentrate this metadata into the central "Oslo" repository based on SQL Server, thereby making that repository the hub of Dynamic IT.

Model-Driven Development

"Oslo," then, is the set of tools that makes it easier to build more and more of any application purely out of data. That is, "Oslo" aims to have the entire application, throughout its entire lifecycle, completely described in data/metadata contained within a database. As described on the "Oslo" Developer's Center:[3]

Model-driven development in the context of "Oslo" indicates a development process that revolves around building applications primarily through metadata. This means moving more of the definition of an application out of the world of code and into the world of data, where the developer's original intent is increasingly transparent to both the platform (and other developers). As data, the application definition can be easily viewed and quickly edited in a variety of forms, and even queried, making all the design and implementation details that much more accessible. As discussed in this topic already, Microsoft technologies have been moving in this direction for many years; things like COM type libraries, .NET Framework metadata attributes, and XAML have all moved increasingly toward declaring one's intentions directly as data—in ways that make sense for your problem domain—and away from encoding them into a lower-level form, such as x86 or .NET intermediate language (IL) instructions. This is what the code name "Oslo" modeling technologies are all about.

The "models" in question aren't anything new: they simply define the structure of the data in a SQL Server database. These are the structures with which the "Oslo" tools interact.
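To make that point concrete, here is a minimal sketch in Python (with sqlite3 standing in for SQL Server, and with entirely hypothetical names; this is an analogy, not the actual "Oslo" tooling) of a "model" that is nothing but data describing a table structure, from which the storage itself is generated:

```python
import sqlite3

# A "model" here is nothing more than a description of structure, held as data.
# (Invented example; the real "Oslo" repository used SQL Server schemas.)
person_model = {
    "name": "Person",
    "fields": [("Id", "INTEGER PRIMARY KEY"), ("Name", "TEXT"), ("Age", "INTEGER")],
}

def ddl_for(model):
    """Turn the model-as-data into the DDL that realizes it."""
    cols = ", ".join(f"{name} {sqltype}" for name, sqltype in model["fields"])
    return f'CREATE TABLE {model["name"]} ({cols})'

conn = sqlite3.connect(":memory:")
conn.execute(ddl_for(person_model))

# Instances live next to the schema as ordinary rows.
conn.execute("INSERT INTO Person (Name, Age) VALUES (?, ?)", ("Ada", 36))
rows = conn.execute("SELECT Name, Age FROM Person").fetchall()
print(rows)  # [('Ada', 36)]
```

Because the model is ordinary data, it can itself be stored, queried, and edited with the same tools that handle the instances, which is the whole premise of the repository.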

Characteristics of the "Oslo" Repository and Domains

From the "Oslo" Developer's Center: [3]

The "Oslo" Repository provides a robust, enterprise-ready storage location for the data models. It takes advantage of the best features of SQL Server 2008 to deliver on critical areas such as scalability, security, and performance. The "Oslo" repository's Base Domain Library (BDL) provides infrastructure and services, simplifying the task of creating and managing enterprise-scale databases. The repository provides the foundation for productively building models and model-driven applications with code name "Oslo" modeling technologies.

"Oslo" also includes additional pre-built "domains," which are pre-defined models and tools for working with particular kinds of data. At present, such domains are included for:[6]

  1. The Common Language Runtime (CLR), which supports extracting metadata from CLR assemblies and storing them in the "Oslo" repository in such a way that they can be explored and queried. A benefit to this domain is that it can maintain such information about the code assets of an entire enterprise, in contrast to tools such as the "Object Explorer" of Microsoft Visual Studio that only works with code assets on a single machine.
  2. Unified Modeling Language (UML), which targets the Object Management Group's Unified Modeling Language™ (UML™) specification version 2.1.2. UML 2.1.2 models in the Object Management Group's XML Metadata Interchange (XMI) version 2.1 file format can be imported into the code name "Oslo" repository with a loader tool included with "Oslo".
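By analogy (this is not the actual CLR domain tooling; the table layout and names are invented, and sqlite3 again stands in for SQL Server), the idea of harvesting code metadata into a queryable repository can be sketched in Python like this:

```python
import inspect
import json
import sqlite3

# Harvest metadata from loaded code into a queryable relational "repository".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Functions (Module TEXT, Name TEXT, Params TEXT)")

def load_metadata(module):
    """Store the name and parameter list of every function in a module."""
    for name, fn in inspect.getmembers(module, inspect.isfunction):
        params = json.dumps(list(inspect.signature(fn).parameters))
        conn.execute("INSERT INTO Functions VALUES (?, ?, ?)",
                     (module.__name__, name, params))

load_metadata(json)  # e.g. index the stdlib json module itself
rows = conn.execute(
    "SELECT Name FROM Functions WHERE Module = 'json' ORDER BY Name").fetchall()
print([r[0] for r in rows])  # includes 'dump', 'dumps', 'load', 'loads'
```

Run against every module (or, in the CLR case, every assembly) in an enterprise, such a store supports exactly the kind of cross-machine querying the domain description claims over a single-machine explorer.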

Note that while the "Oslo" repository is part of the toolset, models may be deployed into any arbitrary SQL Server database; the "Quadrant" tool is also capable of working with arbitrary SQL Server databases.

Characteristics of the "M" Modeling Language

According to the "Oslo" Developer's Center, the "M" language and its features are used to define "custom language, schema for data (data models), and data values."[3] The intention is to allow for a very domain-specific expression of data and metadata values, thereby increasing efficiency and productivity. A key to "M" is that it allows for making statements "about the structure, constraints, and relationships, but says nothing about how the data is stored or accessed, or about what specific values an instance might contain. By default, 'M' models are stored in the 'Oslo' repository, but you are free to modify the output to any storage or access format. If you are familiar with XML, the schema definition feature is like XSD."[3] The "M" language and its associated tools also simplify the creation of custom domain-specific languages (DSLs) by providing a generic infrastructure engine (parser, lexer, and compiler) that is configured with a specific "grammar". Developers have found many uses for such easy-to-define custom languages.[7]
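As an analogy only (this is not "M" or its grammar language; the grammar, the toy DSL, and all names are invented), a generic lexer/parser engine configured by a grammar supplied as data can be sketched in Python as:

```python
import re

# A generic little engine (lexer + parser) whose behavior is configured by a
# "grammar" given as plain data, rather than hard-coded.
grammar = {
    "tokens": [("KEY", r"[A-Za-z]+"), ("EQ", r"="),
               ("VALUE", r"\"[^\"]*\""), ("SKIP", r"\s+")],
}

def lex(text, grammar):
    """Tokenize according to the grammar's token definitions."""
    spec = "|".join(f"(?P<{name}>{pat})" for name, pat in grammar["tokens"])
    return [(m.lastgroup, m.group()) for m in re.finditer(spec, text)
            if m.lastgroup != "SKIP"]

def parse(text, grammar):
    """Parse a series of 'key = "value"' statements into a dictionary."""
    tokens = lex(text, grammar)
    result = {}
    for i in range(0, len(tokens), 3):
        key, eq, value = tokens[i], tokens[i + 1], tokens[i + 2]
        assert key[0] == "KEY" and eq[0] == "EQ" and value[0] == "VALUE"
        result[key[1]] = value[1].strip('"')
    return result

print(parse('server = "oslo01"  port = "1433"', grammar))
# {'server': 'oslo01', 'port': '1433'}
```

Changing the grammar data changes the language accepted, without touching the engine; that separation is the productivity claim being made for "M"-based DSLs.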

Recognizing the widespread interest in the ongoing development of the language, Microsoft shifted that development in March 2009 to a public group of individuals and organizations called the "M" Specification Community.

Characteristics of the "Quadrant" Model Editor

"Oslo's" model editor, known as "Quadrant," is intended to be a new kind of graphical tool for editing and exploring data in any SQL Server database. As described on the "Oslo" Developer's Center: [3]

A user can open multiple windows (called workpads) in "Quadrant". Each workpad can contain a connection to a different database, or a different view of the same database. Each workpad also includes a query box in which users can narrow the data down to a single record or to a set of records that match the query criteria.

"Quadrant" features a different way of visualizing data: along with simple list views and table views, data can be displayed in a tree view, in a properties view, and in variable combinations of these four basic views. An essential part of this is the ability to dynamically switch, at any time, between the simplest and the most complex views of the data. As you explore data with these views, insights and connections between data sets previously unknown may become apparent. And that has benefits for those using the Microsoft "Oslo" modeling technologies to create new models. As part of the "Oslo" modeling technologies toolset, "Quadrant" enables "Oslo" developers to view new models with "Quadrant" viewers. The "Quadrant" data viewing experience enables designers of DSLs to quickly visualize the objects that language users will work with. In this way, "Quadrant" will give developers a quick vision of their models. With this feedback, "Quadrant" can also provide a reality check for the model designer, which may in turn lead to better data structures and models.

In the future, Microsoft intends for "Quadrant" to support greater degrees of domain-specific customization, allowing developers to exactly tailor the interaction with data for specific users and roles within an enterprise.

If you follow the link from the Wikipedia entry (box at the upper right, "Code Name Oslo", website address), you will find that the Oslo project no longer exists as a unit: the link now leads to the Data Platform Developer Center. No reference there, nothing that tells us what Douglas Purdy announces on his blog, though it may still be early. The link, however, has already been redirected. We shall see...
Even today, November 14, the link http://msdn.microsoft.com/en-us/library/cc709420.aspx leads to the section on Oslo in the MSDN Library, in its .NET reference. Undoubtedly, all this content will have to be reengineered. Browsing through its content, still not excessively transformed, gives an idea of the magnitude of what has been given up, if we then return to Purdy's terse redefinition:

The components of the SQL Server Modeling CTP are:

  • “M” is a highly productive, developer friendly, textual language for defining schemas, queries, values, functions and DSLs for SQL Server databases
  • “Quadrant” is a customizable tool for interacting with large datasets stored in SQL Server databases
  • “Repository” is a SQL Server role for the secure sharing of models between applications and systems

We will announce the official names for these components as we land them, but the key thing is that all of these components are now part of SQL Server and will ship with a future release of that product.

It is not only the MSDN pages on Oslo, or Douglas's, that will have to be "refactored". Since June 2008, Steve Cook had been working on integrating UML into Visual Studio while in parallel collaborating with the Oslo project on UML integration. What role did Oslo play in that project? What will Steve's role be now? Looking back over the news published during the last year and a half, the impression that remains is that two parallel lines of research coexisted, and that at least one of them has now ended up on a dead track. What Stuart Kent commented in November 2008 now makes sense:

The Oslo modeling platform was announced at Microsoft's PDC and we've been asked by a number of customers what the relationship is between DSL Tools and Oslo. So I thought it would be worth clearing the air on this. Keith Short from the Oslo team has just posted on this very same question. I haven’t much to add really, except to clarify a couple of things about DSL Tools and VSTS Team Architect.

As Keith pointed out, some commentators have suggested that DSL Tools is dead. This couldn’t be further from the truth. Keith himself points out that "both products have a lifecycle in front of them". In DSL Tools in Visual Studio 2010 I summarize the new features that we're shipping for DSL Tools in VS 2010, and we'll be providing more details in future posts. In short, the platform has expanded to support forms-based designers and interaction between models and designers. There's also the new suite of designers from Team Architect including a set of UML designers and technology specific DSLs coming in VS 2010. These have been built using DSL Tools. Cameron has blogged about this, and there are now some great videos describing the features, including some new technology for visualizing existing code and artifacts. See this entry from Steve for details.

The new features in DSL Tools support integration with the designers from team architect, for example with DSLs of your own, using the new modelbus, and we're exploring other ways in which you can enhance and customize those designers without having to take the step of creating your own DSL. Our T4 text templating technology will also work with these designers for code generation and will allow access to models across the modelbus. You may also be interested in my post Long Time No Blog, UML and DSLs which talks more about the relationship between DSLs and UML.

But returning to those who committed their opinions in favor of Oslo, how must they feel now? I am referring to opinions such as those expressed in the article "Creating Modern Applications: Workflows, Services, and Models" by David Chappell. As early as October 2008, David previewed the project's features, selling what was still only an outline. Even afterwards we saw how some of the elements previewed were set aside and how, while still an immature project, it was again presented as a reality. And so on, until the rude awakening of November 10.
Among the conclusions that can be drawn from this project, now apparently in the process of being buried, two are of particular interest:
  • It is not a good business model to sell as realities what are still sketches. The customer (end-user companies, the developer community, independent consultants, researchers) ends up hurt, for different reasons: some for postponing decisions while waiting for a star product, others for committing their word in favor of something later discarded, and others for losing time waiting for a tool that was never taken seriously.
  • It is problematic to entrust the development of advanced research to the market plans of a single company.
