The path ahead is clearly evolutionary: if Greenfield, Keith Short, Steve Cook and others did indeed point to shortcomings in UML, Oslo shows that there is still ground to cover, and that each of these undertakings still falls short somewhere. Much more could be said about Microsoft's commercial policies, but that will be for another day.

One of the interesting questions about Oslo is its relationship to DSL Tools. Actually, we should say between Oslo and Software Factories (the marketing side), or between M and DSL Tools (the technical side). Technically it seems there is no link -- which means no integration and no upgrade path. On the marketing side, few people seem to have picked up on the fact that Keith Short, co-author of the Software Factories book, moved to work on Oslo nearly two years ago. Steve Cook and Alan Cameron Wills, co-authors of the DSL Tools book, have also left the team, but for UML and MSF respectively.
Of course, people move around, and it's more interesting to hear what people still in those teams say. An Oslo developer writes:
If I look around, I see people doing [declarative, model-driven programming] today in the form of XML schemas and dialects, various textual reps, and frameworks that encode a domain. We went down that path as well, using visual designers and XML. But at some point the pain was too much :) We evolved our approach into Oslo.

Microsoft's "visual designers and XML" presumably refers to DSL Tools, and the comment about the pain being too much is perhaps at least one answer to the question of why Oslo isn't being billed as an evolutionary step along the Software Factories / DSL Tools path. It sounds more like Microsoft have concluded that their DSL Tools are an evolutionary dead end, have taken a step back, and are now heading down a different path. That's the impression I get from Keith Short's blog entry: "both Oslo and the DSL Toolkit have grown from a common belief" in DSLs.
Microsoft are of course claiming both products will continue to be developed, but losing 3 out of 6 main figures from the DSL Tools team is hardly encouraging. Mind you, I think what is needed is even more radical: both Oslo and DSL Tools should be put on hold until Microsoft have figured out what you need for an industrial strength language for describing modeling languages. The resulting languages and tools have to scale to multiple simultaneous users, multiple representational paradigms (graphical, textual, matrix, tabular), multiple platforms (not very likely that one!), integration between multiple modeling languages and multiple models, and evolution through multiple versions of the languages. There are a few more multi's I could add (look at slide 15 from my keynote to the OOPSLA DSM Workshop), but you get the picture. And if you want more than just the picture, get the tool!
Comments, discussions, and notes on trends in the development of information technology, and on the importance of quality in software construction.
Monday, November 10, 2008
Software Factories + DSLs, at a dead end?
Sunday, November 09, 2008
Should it be taken seriously or not?, II
Pedagogical paradoxes, and Bologna; the integrity of academia and market forces.
More on this in the future.
Los muertos que vos matáis... ("the dead you kill...")
The note is brief, but it opens two or three lines of discussion. I don't have time now, but we will talk a bit more later. Still, there persists the idea of comparing and contrasting domain-specific languages with UML, which sits at a different, higher level of abstraction. Had it been the other way around, with DSLs as the only tool, we would surely have seen the appearance of something like UML as a relief, capable of integrating an unmanageable set of specific domains. From Sadek's comment, I would like to highlight his summary of the requirements, explained by Kelly and Tolvanen, for a modeling language:

1) It should map to the domain problem concepts and not implementation details.

2) It has to be formalized and helpful not only for communicating with domain experts but also for development tasks, by making it possible to generate executable code, documentation and certain classes of tests from the models, as well as by eliminating the need to implement some other tests and by raising the level of abstraction for code maintainers.

3) It needs to have stand-alone tooling support that would allow domain experts to "organize their frameworks and libraries in such a way that the models can be 'compiled' into fully functioning code", and this without them having to think in terms of code. To illustrate this idea, Lispy uses the analogy of compiling C to machine code, where C is in some way "a modeling language for machine code" and where C programmers don't have to modify the machine code compiled using compilers created by machine language experts.

The problem is how to define the scope of the model. The Kelly/Tolvanen definition ties a specific domain tightly to the generable code. This atomizes the problem. Persisting with a more abstract view, whether through UML or another representation, allows expressing models of broader scope than a telephone or some other device. What may be more than adequate for embedded systems is not applicable to large systems.
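The Kelly/Tolvanen requirements above can be made concrete with a toy sketch. This is entirely illustrative and not tied to any actual DSM product: a "model" expressed purely in domain concepts (requirement 1), formal enough to generate executable code from (requirement 2), with a generator a domain expert never needs to edit (requirement 3). The menu domain and all names here are invented for the example.

```python
# A toy "domain model" for a phone-style menu: only domain concepts
# (labels, actions), no implementation details.
MENU_MODEL = {
    "name": "MainMenu",
    "items": [
        {"label": "Messages", "action": "open_messages"},
        {"label": "Settings", "action": "open_settings"},
    ],
}

def generate_menu_class(model):
    """'Compile' the model into Python source, the way a DSM tool's
    generator turns models into code the modeler never has to touch."""
    lines = [f"class {model['name']}:"]
    lines.append("    def actions(self):")
    lines.append("        return [")
    for item in model["items"]:
        lines.append(f"            ({item['label']!r}, {item['action']!r}),")
    lines.append("        ]")
    return "\n".join(lines)

source = generate_menu_class(MENU_MODEL)
namespace = {}
exec(source, namespace)          # the generated code is fully functioning
menu = namespace["MainMenu"]()
print(menu.actions())            # [('Messages', 'open_messages'), ('Settings', 'open_settings')]
```

The point of the analogy: the modeler works only on `MENU_MODEL`, just as a C programmer works only on C source and never patches the compiler's machine-code output.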
Saturday, November 08, 2008
Should it be taken seriously or not?
We will come back to all of this later.
Wednesday, November 05, 2008
Two or three small changes to the blog
Sunday, November 02, 2008
An avalanche of information about Oslo
David Chappell's article, "Creating Modern Applications: Workflows, Services, and Models", the most complete piece I have seen so far.
Paul Vick's podcast, "Oslo" -- Microsoft's Modeling Platform, via InformIT.
Two notes by Paul Gielens, 1 and 2.
The publication of the M specification.
There is more, quite a lot in fact. The campaign has opened.
Software in times of crisis, II
Infobae summarizes statements by Miguel Ángel Calello of CESSI. In Calello's words one sees not only the specific risks of the crisis, but also one of the endless Argentine failings: the difficulty of financing. A problem shared with all companies, especially those that, because of their size, cannot opt for international financing. And one that the general crisis will only enlarge. From the article:

Argentina's software and IT services (SSI) sector, one of the activities that grew fastest after the devaluation of the peso in 2002, has begun to feel the effects of the international financial crisis. "The drops have already started to fall," illustrated the president of the industry chamber, Miguel Calello.
Among the hardest hit are the companies that export services. Calello estimated that the drop in exports projected for the sector in 2008, which stood at around $1,350 million, will be between 10 and 12 percent.
At a press conference following the entity's annual assembly, the businessman warned that "many companies will have to change their cost structure" as a consequence of the crisis, so as not to lose competitiveness in foreign markets.
"The first to suffer" will be those players who bet on cheap labor, such as software factories.
But the crisis is also reflected in the postponement of purchasing decisions and the cancellation of contracts, Calello noted, and in the financial side of the business, with collection difficulties.
Even so, the SSI business chamber awaits the coming months with a certain confidence, in light of what happened in 2002, when almost all the companies affiliated with the chamber managed to survive the devaluation of the peso.
That experience "serves us," he said, "to know how to handle ourselves." In that vein, he stressed that "the adaptability to change and the creativity" of local firms will help generate new opportunities.
He warned, however, that SSI companies will not stand idly by: they will strengthen partnership schemes to go out and seek new markets abroad, and they will also renew their demands on the national government.
In that regard, Calello (...) said they will ask the Cristina Kirchner administration to centralize in a single area the government's entire relationship with the SSI sector, today scattered across almost every ministry of the executive branch.
The chamber will also propose a new law for the software sector, modifying the law passed in 2005 that granted tax and fiscal benefits for 10 years. Among the proposed changes are a three-year deadline for certifying quality processes, counted from the moment a company is authorized to receive the fiscal benefit, and the inclusion of VAT refunds for exporting companies. According to the entity, more than 240 companies have been approved under law 25.922.
The chamber will also insist on its demand that the State consider national companies in its software purchases. "If this industry is key for the government, and the one that has grown the most in recent years, the only thing missing is for the government to buy from it. It has to understand that spending on local technology is an investment," he warned.
Another need of the sector is easier access to financing. "A huge percentage of investment is made with companies' own capital. We have to work on forming venture capital. This industry needs investment, it does not need credit," Calello explained.
He also took aim at the universities: he said that institutions of higher learning "are not there to provide services that compete with the private sector, but to train human capital," referring to the service contracts that several universities have signed with government agencies.
As for the digital agenda that the government says it will launch before the end of the year, he clarified: "It is not an agenda for the industry, but a contribution from the sector to society. It is often mistaken for a plan for the industry, and it is important to stress that it is not."
He also asked that the principle of "technological neutrality" be observed in official purchases, in order to defend intellectual property and investment in research and development.
"We defend the local company, whatever its flag. That is, the one that is settled in the country and takes part in the industry's ecosystem. We do not want to defend fly-by-night companies. Starting with this crisis, we will see which ones came only to take advantage of cheap labor."
For 2009 the chamber plans to create a high-level training center, distinct from the kind of education offered by the mixed scholarship plans known as Ctrl F. The private center will work for employees of member companies, or for third parties who want training. One of the profiles it will train is that of consultant, so that people over 40 can re-enter the labor market.
There will undoubtedly be a contraction in international demand, and services outsourcing will surely run into trouble. As has been said before, Argentine software must raise the quality of its leap into foreign markets. And the crisis may mean lost contracts, but also new opportunities, for those who know where to look for them. This is a bad time for giants. Just as we have seen the failure of many rating agencies unable to foresee the collapse of the big banks (or with an interest in not seeing it), so too the ones who should take warning in this period are the big players, those who bill services and products that do not fit a crisis. CESSI's measures for maintaining competitiveness seem on target.
It is worth recalling a recent Spanish study on the opportunities of this moment:
"Entrepreneurs are the ones who suffer least from the crisis, since they have very lean and efficient structures, the fruit of effective, continuous management aimed at maximum return on resources." "Among those interviewed, more than 65% pointed to the opportunities that arise in a stalled and rattled market. The chances of consolidating a project and making it grow increase with the restructuring of companies and workforces." (Tech Sales Group, "Los Emprendedores, los más optimistas ante la crisis")
Sunday, October 26, 2008
Software in times of crisis
Monday, October 20, 2008
Yahoo climbs aboard...
The Californian internet company Yahoo plans to implement a cost-cutting plan that includes, among other measures, the layoff of more than 1,000 workers, The Wall Street Journal reported.
According to the American paper, which cites a source close to the operation, the number of layoffs could thus exceed the 1,000 announced in January, which represented 7 percent of a workforce of more than 14,000 employees as of June 30.
With this plan, which will possibly be announced tomorrow, coinciding with the presentation of quarterly results, Yahoo aims to improve its position in the face of the economic crisis and the tough competition in the sector.
If the announcement materializes, it will be the largest job cut in Yahoo's history, surpassing the one carried out in 2001, when it laid off 660 workers.
The crisis reaches software
Worthen's article is discussed by Vinnie Mirchandani, who for some time now has been devoting space to SAP, particularly its maintenance-cost policy toward its customers. A policy that will very probably also see changes. The article:

SAP announced last week that its revenue for the quarter would fall short of its guidance due to a sudden drop in orders at the end of September. That sent the company into cost-cutting mode, as outlined in an email co-CEOs Henning Kagermann and Leo Apotheker sent to staff last week. A copy of the email was obtained by the Business Technology Blog.
The party line in the tech industry is that businesses will keep spending on tech because it makes them more efficient. This, in turn will help them survive the downturn. We’re not sure whether to file this under irony or hypocrisy, but SAP is – you guessed it – halting new spending on information technology. “We will review all planned investments in IT equipment, hardware, software, facilities, and company cars, as well as internal IT projects,” the co-CEOs wrote in the email. “Do not order any new equipment at this time.”
The email captures the uncertainty at SAP – uncertainty that is no doubt shared by other companies in the industry. “No one at this point can say how markets and customers will react in the coming months,” the email says. “In this turbulent economic environment, we will be giving added attention to sustaining our margin and earnings health.”
Aside from halting its tech spending, here’s how SAP plans to do that, as taken verbatim from the email:
* Headcount and Hiring Freeze: “There is a complete headcount and hiring freeze, and all existing job vacancies will be canceled. This includes any temporary workers, interns, and students. There will be no replacements for employees leaving SAP. No internal transfers may take place. Only those written offers sent to a candidate and/or internal transfers agreed to on or before October 7, 2008, will go forward.”
* Third-Party Expenses: “Since we are not hiring, all engagement with external recruiters must cease immediately. We will discontinue engagement with management consultants and evaluate the impact this has on ongoing projects. Until further notice, all external training is to be canceled. Internal meetings must be held within SAP buildings, and you cannot rent external conference facilities for this purpose.”
* Travel: “Cease ALL internal non-customer-facing travel in October…Any non-customer-facing travel already booked should be canceled immediately, even if this incurs penalties.” SAP sales people will also have to fly coach from now on unless they use miles to upgrade.
Thursday, October 16, 2008
More news about Oslo
First, the new announcements. Gavin Clarke, for The Register, highlights the boost that Oslo gives Microsoft's Software Factories. The green underlining, as always, is mine:

The company is today [October 10] due to announce M, for building textual domain-specific languages (DSLs) and software models with XAML. Microsoft will also announce Quadrant, for building and viewing models visually, and a repository for storing and combining models using a SQL Server database.
(...) Wahbe said [Quadrant] will be a generic editor that lets you view a model as a list of nodes, a tree, graph, or boxes of lines laid out for a workflow, and combine views to visualize the model.
(...) Underpinning this will be the repository, where you will be able to combine models. Wahbe said models could be combined because they'd be translated into their basic elements - tables. The repository will support models from third parties and those built using UML in addition to those constructed in M and Quadrant.
Kathleen Richards, in Application Development Trends, says about Oslo:

Oslo is Microsoft's big new push to give developers tools and techniques in modeling and code re-use to improve the design, build, and testing of software. The SOA tie-in comes through the ability to combine models with things such as Microsoft's Windows Workflow Foundation (WF) for process and workflow.
It's a fresh take on an existing model and code re-use concept. Some years back, Microsoft was pushing the idea of Software Factories that featured in the then-new Visual Studio Team System (VSTS).
Software Factories used DSLs and XML for you to build re-usable models that punched out code for specific uses - or domains such as, say, HR in banking.
The factories idea came as it attempted to "democratize" application lifecycle management (ALM) with VSTS by making tools for modeling and testing easier to use. Also, Microsoft was adopting the idea of working with highly tuned DSLs rather than the more generic industry standard Unified Modeling Language (UML), which was seen by some as bloated with version 2.0.
While an admirable strategy, Software Factories have not taken off as Microsoft had probably hoped, and modeling remains a pursuit not of coders but mostly of architects and designers.
Robert Wahbe, corporate vice president of Microsoft's connected systems division, told The Reg that Oslo would give Software Factories a "huge boost" as it would make models easier to build both textually and visually. Wahbe said M will provide unique services not available in other languages - in declarative keywords and syntax and how the DSL is translated and stored in the repository.
First announced in October 2007, Oslo will enable .NET developers to more easily create domain-specific languages -- custom mini-languages, often industry- or component-related, that can be used to solve similar problems in a common domain. Microsoft's existing DSL toolkit, which focuses on graphical DSLs, first appeared as an extension to Visual Studio 2005 and was baked into the Visual Studio 2008 SDK.

Oslo extends the existing DSL Toolkit; adds support for UML, BPMN and BPEL via a visual designer; and stores the artifacts and conceptual diagrams in a common SQL Server database. The goal is to enhance traditional imperative programming techniques with higher-level models to improve flexibility in terms of extensibility and changing the behavior of the app, as well as to increase transparency and productivity.

Oslo is designed to unify Microsoft's modeling technologies. It is likely to be the successor to the company's modeling tools such as the Web Services Software Factory: Modeling Edition, which offers guidance and code generation in Visual Studio 2005 and Visual Studio 2008.

It still looks like a sketch. It keeps giving (me) the impression of being a way of arriving at model-driven design from third-generation tools one wishes to preserve.

Kathleen adds a likely aspect of Quadrant's purpose: its use as a modeling tool for business analysts:

Data-driven app developer Roger Jennings, a contributing editor to Redmond Developer News' sister publication Visual Studio Magazine and OakLeaf Systems blogger, said that with Oslo, Microsoft is moving to a repository for componentized software. The company's intention is to make the visual designer Quadrant usable by business analysts -- to modify workflows, for example -- but whether that will work is unclear. "My feeling about Oslo is that what they are trying to do is get business analysts involved in the design process, but not necessarily doing the design," Jennings said.
According to Wahbe, the first version of Oslo is definitely targeted at the development community. For business analysts and end users, he said, "What we expect is that those models will be surfaced naturally within their existing toolsets." For example, SharePoint has document approval that "under the covers" is using a model and workflow but that is not directly exposed to the user.
However, he added, "The tool that we've built is very easy to use and we imagine over time analysts and IT using that tool."
Monday, October 13, 2008
How will the crisis affect software?
So perhaps it wouldn't hurt to analyze a bit what will happen to the software industry.
Looking at share prices on the main stock exchanges, the largest software companies are not, with a few exceptions, among the hardest hit. Nevertheless, a cooling economy implies conservative positions on the part of every economic activity, service, and state administration: saving, project cuts, simplified objectives. There is also an opposing aspect: the greater the need, the more ingenuity will be required to survive, and there may be room for innovations with real returns.
But will the lavish projects continue? Those that depend on government administrations or large corporations will probably be under jealous scrutiny, if they secure a budget at all.
We will probably have to follow closely the evolution of the big ERPs, for example Oracle and SAP, which for the moment have shown some signs of difficulty.
What will happen to outsourcing? Will the large centers take priority, or will the trend toward outsourcing closer to home, in nearby regions, grow stronger? Latin America would benefit over Asia.
For a few days we will try to take up cases and opinions: we are on the edge of changes that will span several years.
Friday, October 10, 2008
Outsourcing of creative work: in Buenos Aires
The most important part of the article:
Outsourcing is becoming more and more commonplace for companies and clients in the developed world. The Internet, email, and lately collaborative applications like Skype, ClockingIT, Basecamp, and WebEx have made it easier to outsource work that can be sent digitally. Creative Work has long been considered difficult to outsource since it needed close contact between client & provider and is often culture specific. Creative work is now the new outsourcing threshold. Where can you best outsource Creative Work? To India where the majority of programming is going, or to China, the manufacturing giant? What about Eastern Europe or Mexico? Finally all water flows to the lowest point. The lowest point is defined as the place where we find the best price and quality relationship. We have reason to believe that Argentina will become the global center for Creative Outsourcing. The abundance of creative talent, highly educated specialists, Western culture and low cost make Argentina ideal for the outsourcing of Web Design, Graphic Design, Flash, 3D, Games and Video.

We could say this article was written in mid-September, or somewhat earlier. Its optimistic vision of a growing creative market is now due for revision, in the face of a probable medium- or long-term recession. Yet the human resources are there, advancing despite difficulties of every kind. Certainly, the idea of outsourcing spread in Argentina as an alternative and an initiative in the worst of circumstances. Perhaps it will also find its way in this coming new scenario.
What makes Argentina so well suited for Creative Outsourcing?
1. Economic setting
Companies in Argentina only started to export services after the big depreciation of its currency at the end of 2001. Let me explain why. Up until the late 1980s telecoms companies were monopolies or state owned and famous for their inefficiency. It took the average Argentinean citizen up to 15 years to get a telephone land line connection. Then, in 1989 President Menem came to power. He did two things that eventually turned Argentina into an export country for services. First, he freed up the telecoms market. The state owned companies were sold to European buyers. Companies like Telecom from Italy and Telefonica from Spain invested heavily in a modern telecoms infrastructure. They placed fiber optic lines throughout Buenos Aires and connected the major urban centers. Within a few years people had easy access to modern telephone systems. This alone, however, did not turn Argentina into an outsourcing power house. Menem's second priority was to curb the country's endemic inflation. The Argentine government pegged the value of the Argentine Peso to the US Dollar at parity of 1 Peso to 1 Dollar. This meant that, by law, the Peso could not decrease in value more than the Dollar. Hyperinflation came to an immediate standstill but prices kept rising, albeit at a slower pace. Soon a Coca Cola cost more in Buenos Aires than in New York. Argentina started to import more and more and its local industry began to perish. Argentina could not compete with the modern and highly capital intensive industries of Western Europe and the United States. To make matters worse, Menem was unable to balance the books. The government spent more money than it received. The holes were filled with loans which resulted in an ever growing debt burden. Finally, at the end of 2001 the system collapsed. The country defaulted on its loans and its currency depreciated 3 to 1 to the dollar.
This left Argentina with a modern telecom infrastructure, a competitive telecom market, highly skilled labor and wages that could be compared to India, but no one to benefit from these enormous opportunities. The first players to jump into the market were international call center giants like Teleperformance and Teletec. Soon after, companies that included services like programming and web development were investing time and resources into Argentina. Everything that could be sold digitally prospered. You might wonder where this new situation will lead. What is the comparative advantage of Argentina over the rest of the world? While browsing local web pages and observing the quality of local television ads you will find that Argentina has an excellent price and quality for creative services. Creativity or Western culture cannot be taught in a university. It is something deeply ingrained into a country's genes. Where India and China can beat the world in programming and manufacturing, Argentina can be best at providing high quality creative services with a Western style. Let's look a bit more into the elements that make Argentina well suited to deliver creative services.
2. Abundance of Creative talent
When hiring creative talent in Argentina you will be surprised at the number and quality of people that apply for a creative position. The abundance of highly qualified people that look for work as a web designer, graphic designer, 3D specialist, animator and video developer is just stunning. Upon a closer look at the existing creative outsourcing industry you will find a plethora of very professional companies with great design capacity that you will not see in other countries like The United States or Europe. Digging deeper, you will find that Argentina has always had the highest cultural standing in South America. Buenos Aires is the center of South America for cabaret, cinema and TV productions. Recently MTV decided to move their regional headquarters to Buenos Aires. As one can guess, MTV is one of the most creative companies worldwide. MTV Networks' CEO Robert Bakish commented that "Exporting Argentine creativity will enforce our commitment to create the best content for our audience".
3. Highly Educated Population
Historically, Argentina has the best educational system in South America. There is great availability of highly skilled professionals, especially in the field of IT, as this sector has developed significantly in Argentina over the past few years. Argentineans have always been very successful abroad and are well-known for their professionalism.
4. Western Style favors Argentina
98% of Argentineans are of European descent, the majority being a mix of Italian and Spanish immigrants. Argentina has always looked to Europe as its point of reference, and this resulted in Buenos Aires claiming the title of the "Paris of South America". Visitors are unanimously enthusiastic about this cosmopolitan city with its architecture, night life, cultural activity, tango, and fashion industry with its creative flair and international style.
5. Wage prices can be compared to India
Since the end of 2001 the country has recovered most of the jobs that were lost and some of its purchasing power. Salaries for highly qualified creative people still run at around 700 USD per month. Due to the abundance of people aspiring to work in the creative sector the wages remain stable. Exceptional people accept relatively low wages just to be able to work and gain experience in the industry. Furthermore, Buenos Aires is one of the cheapest capitals in the world. Salary costs in Argentina are 60% lower than the average salary costs in Eastern Europe, which are 15% higher than the average wages in India.
6. Creative Outsourcing Market booming
Argentina has seen an enormous boom in its outsourcing industry since 2001. Companies like Connaxis (www.creative-outsourcing.com); Latin3 (www.latin3.com), WebAr (www.webar.com), WUNDERMAN (www.wundermandigital.com.ar), crossmedia (www.crossmedia.com), Boogie Man Media (www.boogiemanmedia.com), XAGA (www.XAGA.com), 451 (www.451.com), Bridger Conway (www.bridgerconwayinteractive.com), Group94 (www.group94.com), and Gameloft (www.gameloft.com) have experienced significant exponential growth over the past 6 years. Lately, companies like MTV have made Buenos Aires their local Latin American hub to produce all creative work for the Latin American market. Connaxis is the first company to recognize Creative Outsourcing as one of the country's greatest strategic long-term opportunities. Connaxis CEO, Peter van Grinsven, commented that "The strength of Argentina lies in its abundance of creative talent, highly educated specialists, Western culture and low cost". Latin3 is the premier provider of "Exponential Marketing" services in Latin America and the U.S. Hispanic markets. This company is a creative outsourcing giant, employing 120 professionals and developing creative work for companies like Cisco Latin America, Dell Latin America, Lexicon, Microsoft Latin America, Nextel International, Pepsi Latin America and many more. Gameloft has been developing mobile games in Argentina since 2001 and expanded its staff to over 400 professionals in 2008. Recently, MTV has announced its plan to create 200 new full time work positions. MTV Buenos Aires will become the creative center of Latin America for MTV, VH1 and Nickelodeon, with the responsibility of creating a portion of all the creative work done for cinema.
The success of these companies is only the beginning. It is a story in the making. Argentina is transforming itself into the creative outsourcing center of the world. It is already a success story for the first-mover companies described in this article. These companies are experiencing exponential growth and worldwide recognition for the creative skills delivered to their international clients. It is not a story of selling web design but one of 'creativity'; it is something more abstract that originates from deep within a country's culture and history. The mix of citizens of Western European descent, the particular history, and the recent introduction of email and the Internet is generating a new industry of Creative Outsourcing in Argentina.
Mobile phones in Japan
Thursday, October 09, 2008
More praise for Argentina's technological role
As ever, it's a mixed bag (and hopefully someone will take note of that one day): praise on one hand, caveats on the other. The first comment on Vinnie's post points out the cons: "Outsourcing of Creative Work is to Argentina what Outsourcing of Programming is to India..."
While the report I wrote about last week basically said to forget countries and focus on cities when it comes to global sourcing, a press release I saw this week says: wait, it's not either/or. Country competence is still important when it comes to tech. It was talking about Argentina's focus on creative web design.
"The abundance of highly qualified people looking for work as web designers, graphic designers, 3D specialists, animators and video developers is just stunning. ...Argentina has always had the highest cultural standing in South America. Buenos Aires is the center of South America for cabaret, cinema and TV productions. Recently MTV decided to move their regional headquarters to Buenos Aires. As one can guess, MTV is one of the most creative companies worldwide."
Fair point, and as we look around in technology there are definite signs of country competence.
For example - before Argentina gets too confident about its creative juices, check out this article about animation skills in Korea. And notice the article was dated over a decade ago. Taiwan, which contributes many components to the iPhone and is home to HTC, has, like Finland with Nokia, made a name for itself in the mobile world. Singapore has been establishing itself as a bio-medical hub. Israel leads in various security areas - though Estonia has been building its own cyber-security reputation. Brazil is being called the "Saudi Arabia of biofuels" - I saw ethanol used in cars there on a visit in 1984. Dubai is leading the Middle East with its innovation in architecture, construction and engineering. The Philippines has made a name around call centers and, increasingly, other BPO.
I could go on and on.
Ok, country and city choice are both pretty important when it comes to global sourcing. And of course the vendor, and probably most importantly the vendor team.
Argentina has extraordinarily difficult labor laws and unionization to navigate. Recommending it as a destination needs to be researched carefully with local labor attorneys and companies doing business there - even if you're going through a vendor.
While it's a wonderful and fascinating place to visit, productivity and communication there aren't as 'mature' as in other destinations.
I agree cities play a major part in the choice - there is a huge difference between Chennai and Bangalore (grid, transportation, competitive culture).
Tuesday, October 07, 2008
Software industry: Spanish trade mission in Buenos Aires
- Argentines are very well prepared. There is a base of significant companies, with trained people, good universities, an abundance of engineers, and a business fabric at every level, from large multinationals to SMEs.
- Both the economy and the population are heavily concentrated in Greater Buenos Aires, which accounts for almost half of the population and a good share of the companies. Nevertheless, other cities such as Córdoba, Mendoza, Rosario and La Plata have thriving companies that are often subcontracted by those based in the capital.
- There is a national commitment to software development, which has led to tailor-made legislation, the creation of a technology district in the style of @22, and strong export incentives through the Ley de Promoción de la Industria del Software y Servicios Informáticos (Software and IT Services Industry Promotion Law).
- Many companies are already exporting, and some even adapt their schedules to coordinate the relationship, with staff starting at 6 a.m. to increase the hours of overlap with European clients.
- Most exporters already have a good client base in the US, covering a very wide range: from Google (part of Checkout and OpenSocial was developed here) to SMEs with a couple of employees.
- Most exporters have software development and web design companies as their main clients.
- Buenos Aires companies are beginning to have trouble recruiting qualified staff, and many keep freelancers or companies outside the area to subcontract to at peaks of work.
- They can be engaged either by project (a fixed price for a specific development) or by the hour (qualified resources billed hourly). The hourly cost of a programmer with more than 3 years of experience runs from 12 to 25 euros.
- There are many more companies with Java and .NET expertise, but few with PHP or Ruby.
As was to be expected, in another post he describes the weak points and burdens to be faced. And, as in other sectors that could contribute growth to the economy, we find the burden imposed by the State. It is worth noting that the initiatives that helped the country out of the crisis were generally born from the effort of private individuals, with little help from the government. Thus, Antonio notes:
We always complain about taxes, whether as individuals or as business owners. So, cold comfort though it may be, here are some figures on the tax situation in Argentina.
Every company must pay the following:
Gross receipts tax: a flat 3% on turnover, settled monthly, regardless of your profits.
Transfer tax: 0.6% on every credit or debit movement in the account; 1.2% in total, which the bank collects and later settles with the state.
Profit tax ("utilidades"), the equivalent of corporate income tax: 35% on profit.
IVA, or Value Added Tax: 21% of the total.
What astonishes me most is that 4.2% on sales from the first two taxes alone.
To add insult to injury, any money received from abroad must be channeled through the central bank, which then delays the payment into your final account by 15 to 30 days. That is, a Spanish client pays a bank that acts as consolidator and controller, and only later settles into the Argentine businessman's account.
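The arithmetic behind that 4.2% figure can be checked quickly. A minimal sketch (the class and method names are mine, and the rates are simply the ones quoted above):

```java
public class ArgentineTaxSketch {
    // Rates quoted above: 3% gross-receipts tax on turnover, plus a
    // 0.6% bank-transfer tax on each movement (one credit when the
    // client pays in, one debit when the money is paid out).
    static final double GROSS_RECEIPTS = 0.03;
    static final double TRANSFER_PER_MOVEMENT = 0.006;

    // Combined turnover-based burden, before any profit tax or VAT.
    static double turnoverBurden() {
        return GROSS_RECEIPTS + 2 * TRANSFER_PER_MOVEMENT; // 3% + 1.2%
    }

    public static void main(String[] args) {
        double invoice = 1000.0; // a hypothetical 1,000-peso invoice
        System.out.printf("Turnover taxes: %.1f pesos (%.1f%% of sales)%n",
                invoice * turnoverBurden(), 100 * turnoverBurden());
    }
}
```

On a 1,000-peso invoice, 42 pesos are gone before profit tax or VAT even enter the picture.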
Tuesday, September 30, 2008
Plex in Eclipse
From his announcement on the Plex forum:
As Christopher says, more information can be found on the Plex Wiki:
The complete Eclipse project for Plex Services for Eclipse is now hosted at SourceForge.
The project is located at http://sourceforge.net/projects/plexservices/
The subversion repository is located at https://plexservices.svn.sourceforge.net/svnroot/plexservices
This project is licenced with the Apache 2.0 open source licence so please use it and contribute.
If you want to just install the plugin, an Eclipse update site is available at http://adcaustin.com/eclipse/
There is a half completed setup page on the Plex WIKI.
Please remember, I'm doing this in my spare time. If it needs a feature or a fix, I will try and help as best as my schedule allows.
If you have a question on how you can extend it and want to share, I will be MORE than happy to answer.
The plugin currently requires at least Eclipse 3.3 and I'm targeting my future development for 3.4.
How does Eclipse relate to Plex?
Plex can generate Java source code for very functional and usable applications. Building, debugging, deploying and managing the generated source is not an easy or straightforward task. Eclipse can handle these tasks for you.
Plex 5.X ships with Microsoft's NMAKE as its Java builder (6.0 uses ANT). Building a large application can take a very long time. Eclipse can build your application in a fraction of the time. Diagnosing build errors from NMAKE's compile listings is clumsy, if not impossible. Finding and discovering how to correct build errors in Eclipse is quite simple.
Debugging Java without an IDE can be very difficult. Debugging is an inherent part of the Eclipse Platform.
Deploying a Plex Java application is a cumbersome manual process. Eclipse, along with the integrated ANT support, can automate the whole process. You can even create J2EE projects in Eclipse to automate the packaging and deployment of Plex EJB Proxy applications to J2EE servers.
Managing a single developer environment of Plex generated source code and other required artifacts is difficult to impossible, never mind trying to do so in a multi-developer environment. Eclipse provides a "workspace" for each developer and source repository support for source control products like CVS and Subversion.
Sunday, September 28, 2008
Windows 7 on Vista's path?
It is likely that, in these terms, Windows 7 will face vicissitudes similar to Vista's... "We create feature teams with n developers, n testers, and 1/2n program managers," Sinofsky wrote in a four-page blog that introduced his views on managing large-scale software development. "On average a feature team is about 40 developers across the Windows 7 project."
Based on that arrangement, each feature team would appear to have about 40 developers writing code, an equal number of beta testers -- which Sinofsky separately described as "software development engineers in test" -- and about 20 program managers.
In other words, that would be 2,000 developers creating or testing Windows 7 code, overseen by 500 managers.
Microsoft's public relations firms declined to confirm or clarify those figures.
Friday, September 26, 2008
(Thinking in Objects)
Henry proposes an example (imperfect, and criticized by several of his readers): In order to be able to have a mental theory one needs to be able to understand that other people may have a different view of the world. On a narrow three-dimensional understanding of 'view', this reveals itself in that people at different locations in a room will see different things. One person may be able to see a cat behind a tree that will be hidden to another. In some sense, though, these two views can easily be merged into a coherent description. They are not contradictory. But we can do the same in higher dimensions. We can think of people as believing themselves to be in one of a number of possible worlds. Sally believes she is in a world where the ball is in the basket, whereas Ann believes she is in a world where the ball is in the box. Here the worlds are contradictory. They cannot both be true of the actual world.
To be able to make this type of statement one has to be able to do at least the following things:
- Speak of ways the world could be
- Refer to objects across these worlds
- Compare these worlds
Henry advocates the use of RDF (Resource Description Framework) as a tool capable of adequately expressing the different views of a problem, in a universe of views rather than objects. But that undoubtedly deserves a space of its own. Let us illustrate this with a simple example. Let us see how one could naively program the puppet play in Java. Let us first create the objects we will need:

Person sally = new Person("Sally");
Person ann = new Person("Ann");
Container basket = new Container("Basket");
Container box = new Container("Box");
Ball ball = new Ball("b1");
Container room = new Container("Room");

So far so good. We have all the objects. We can easily imagine code like the following to add the ball into the basket, and the basket into the room.

basket.add(ball);
room.add(basket);

Perhaps we have methods whereby the objects can ask what their container is. This would be useful for writing code to make sure that a thing could not be in two different places at once - in the basket and in the box, unless the basket was in the box.

Container c = ball.getImmediateContainer();
Assert.true(c == basket);
try {
    box.add(ball);
    Assert.fail();
} catch (InTwoPlacesException e) {
}

All that is going to be tedious coding, full of complicated issues of their own, but it's the usual stuff. Now what about the beliefs of Sally and Ann? How do we specify those? Perhaps we can think of sally and ann as being small databases of objects they are conscious of. Then one could just add them like this:

sally.consciousOf(basket, box, ball);
ann.consciousOf(basket, box, ball);

But the problem should be obvious now. If we move the ball from the basket to the box, the state of the objects in sally's and ann's databases will be exactly the same! After all, they are the same objects!

basket.remove(ball);
box.add(ball);
Ball sb = sally.get(Ball.class, "b1");
Assert.true(box.contains(sb));
// that is because
Ball ab = ann.get(Ball.class, "b1");
Assert.true(ab == sb);

There is really no way to change the state of the ball for one person and not for the other... unless perhaps we give both people different objects. This means that for each person we would have to make a copy of all the objects that they could think of. But then we would have a completely different problem: deciding when those two objects are the same. For it is usually understood that the equality of two objects depends on their state. So one usually would not think that a physical object could be the same if it was in two different physical places. Certainly, if we had a ball b1 in a box and another ball b2 in a basket, what on earth would allow us to say we were speaking of the same ball? Perhaps their name, if we could guarantee that we had unique names for things. But we would still have some pretty odd things going on: objects that are somehow equal, yet in completely different states! And this is just the beginning of our problems. Just think of the dangers involved in taking an object from ann's belief database, and how easy it would be to allow it, by mistake, to be added to sally's belief store.
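One way out of the shared-state trap Henry describes is to separate identity from state: give each believer her own snapshot of the world, keyed by the object's name, so the worlds can diverge while "b1" still names the same ball in both. This is, in miniature, what RDF named graphs provide. A runnable sketch (the class and method names here are mine, not code from the article):

```java
import java.util.HashMap;
import java.util.Map;

// A "possible world" modeled as a map from an object's name to the
// name of its container. Each believer holds her own snapshot, so
// mutating the "real" world does not silently rewrite her beliefs.
public class BeliefWorlds {

    // Snapshot the current world for one believer.
    static Map<String, String> worldOf(Map<String, String> world) {
        return new HashMap<>(world); // a copy, not shared state
    }

    public static void main(String[] args) {
        Map<String, String> world = new HashMap<>();
        world.put("b1", "Basket"); // the ball starts in the basket

        Map<String, String> sally = worldOf(world); // both girls see it
        Map<String, String> ann = worldOf(world);

        world.put("b1", "Box"); // the ball is moved...
        ann.put("b1", "Box");   // ...and only Ann watched the move

        System.out.println("Sally believes b1 is in the " + sally.get("b1"));
        System.out.println("Ann believes b1 is in the " + ann.get("b1"));
    }
}
```

The two worlds now contradict each other, yet both can still refer to "b1" and be compared, which is exactly the three abilities listed above.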
Among the comments that follow the article, the exchange between Ryan and Henry opens things up a little further.
Ryan:
I really like your analysis of OOP and its relation to autism. I have never thought of it in such a way, but it does make a lot of sense (when I am able to remove my preconceptions). However, why is autism bad in this scenario (if you are even implying it)? I can understand that if our perspective is not an omniscient one then this can fail us. Would you please provide an applied example of the problem at hand in relation to your point on the Semantic Web? Thanks a lot!
Henry:
Thanks for asking, Ryan. Yes, there are a lot of examples. The following article, "Extending the Representational State Transfer (REST) Architectural Style for Decentralized Systems", which you can find here
http://portal.acm.org/citation.cfm?id=999447
makes the point about how the distance between the source of a message and the recipient of a message makes perfect immediate communication impossible, if you think of it as resources having the same access to an object. But if you think of it as message passing then you can do some interesting things... Ok, I read that quickly, but that is what made me decide to write this article out today. In the AddressBook I am writing, which I describe in an audio slide cast here:
http://blogs.sun.com/bblfish/entry/building_secure_and_distributed_social
I need to get data from distributed places around the web. This can only be done seriously if you accept that there will be spammers, liars, and just simply wrong data out there. So though you may by default merge data, you may want to make it easy to unmerge it too. I wrote about that in more detail here:
http://blogs.sun.com/bblfish/entry/beatnik_change_your_mind
As I said, if you are writing tools that you can think of as physical, mechanical objects that don't have to have points of view on the universe, say a web browser, a calculator, or some such thing, then this is not important. But as soon as you want to mesh the information on the web, you will need to take the opinions of others into account. We are fast moving to a world where this is going to become more and more important.
In any case it is good to know the limitations of your tools. :-)
In the same vein, Benjamin Damman notes:
Hmmmm. Parts of your intriguing article made me think of Erlang.
"When one fetches information from a remote server one just has to take into account the fact that the server's view of the world may be different and incompatible in some respects with one's own. One cannot in an open world just assume that everybody agrees with everything. One is forced to develop languages that enable a theory of mind. A lot of failures in distributed programming can probably be traced down to working with tools that don't."
Erlang was created for coding highly fault-tolerant (and distributed) systems; the characteristics stemming from that fact might make it an example of a language that 'enables a theory of mind.'
A paper rich in suggestive ideas, beyond the points raised by its critics.
Tuesday, September 16, 2008
The Oslo project
Ron Jacobs says, announcing his session with David Chappell:
Microsoft's "Oslo" project aims at creating a unified platform for model-based, service-oriented applications. This new approach will affect the next versions of several products and technologies, including the Microsoft .NET Framework, Microsoft Visual Studio, Microsoft BizTalk Server, Microsoft System Center, and more. Although many details of "Oslo" won't be public until later in 2008, this session provides an overview of what Microsoft has revealed so far. Along with a description of the problems it addresses, the session includes a look at several new "Oslo" technologies, including a general-purpose modeling language, role-specific modeling tools, a shared model repository, and a distributed service bus.
One of Oslo's new elements is the push behind the language D. Darryl Taft writes:
“The language was designed with an RDBMS [relational DBMS] as very, very, very much top-of-mind, so that we have a very clean mapping,” Lovering said. “But the language is not hard-wired to an RDBMS or relational model. And the language is actually built against an abstract data model. We represent the program itself also in that same abstract data model, which is a very LISP-ish idea—you know, where the whole program itself is the same data structure on which it operates.”
On its SOA-dedicated site, Microsoft summarizes Oslo's characteristics as follows:
”Oslo” is the codename for Microsoft’s forthcoming modeling platform. Modeling is used across a wide range of domains and allows more people to participate in application design and allows developers to write applications at a much higher level of abstraction. “Oslo” consists of:
- A tool that helps people define and interact with models in a rich and visual manner
- A language that helps people create and use textual domain-specific languages and data models
- A relational repository that makes models available to both tools and platform components
Three elements stand out in what Microsoft staffers and insiders have been revealing: the aforementioned use of a new language (D), the emphasis on modeling and abstraction, and the idea of a repository that organizes the participating resources. None of this is new (the idea of a repository underpinning modeling tools had already spawned initiatives from Microsoft and others in the 1990s), but here the combination is applied to resources that have matured and that have already been widely discussed.
On Microsoft's SOA site, some ideas laid out by Bob Muglia shed light on the future Oslo will bring:
We will await further news...
“Oslo” and a Mainstream Approach to Modeling
Modeling has often been heralded as a means to break down technology and role silos in application development to assist IT departments in delivering more effective business strategies. However, while the promise of modeling has existed for decades, it has failed to have a mainstream impact on the way organizations develop and manage their core applications. Microsoft believes that models must evolve to be more than static diagrams that define a software system; they are a core part of daily business discussions, from organizational charts to cash flow diagrams. Implementing models as part of the design, deployment and management process would give organizations a deeper way to define and communicate across all participants and aspects involved in the application lifecycle.
In order to make model-driven development a reality, Microsoft is focused on providing a model-driven platform and visual modeling tools that make it easy for all “mainstream” users, including information workers, developers, database architects, software architects, business analysts and IT professionals, to collaborate throughout the application development lifecycle. By putting model-driven innovation directly into the .NET platform, organizations will gain visibility and control over applications from end-to-end, ensuring they are building systems based on the right requirements, simplifying iterative development and re-use, and enabling them to resolve potential issues at a high level before they start committing resources.
Modeling is a core focus of Microsoft’s Dynamic IT strategy, the company’s long-term approach to provide customers with technology, services and best practices to enable IT and development organizations to be more strategic to the business. “Oslo” is a core piece of delivering on this strategy.
“The benefits of modeling have always been clear, but traditionally only large enterprises have been able to take advantage of it and on a limited scale. We are making great strides in extending these benefits to a broader audience by focusing on three areas. First, we are deeply integrating modeling into our core .NET platform; second, on top of the platform, we then build a very rich set of perspectives that help specific personas in the lifecycle get involved; and finally, we are collaborating with partners and organizations like OMG to ensure we are offering customers the level of choice and flexibility they need.”
Bob Muglia, Senior Vice President, Microsoft Server & Tools Business
Thursday, September 11, 2008
Microsoft joins OMG
InfoQ devotes an article to the news, reviewing the background and some of the voices already discussed here.
REDMOND, Wash. — Sept. 10, 2008 — Microsoft Corp. today outlined its approach for taking modeling into mainstream industry use and announced its membership in the standards body Object Management Group™ (OMG™). Modeling is a core focus of Microsoft’s Dynamic IT strategy, the company’s long-term approach to provide customers with technology, services and best practices to enable IT and development organizations to be more strategic to the business.
Modeling often has been heralded as a means to break down technology and role silos in application development to assist IT departments in delivering more effective business strategies. However, although the promise of modeling has existed for decades, it has failed to have a mainstream impact on the way organizations develop and manage their core applications. Microsoft believes that models must evolve to be more than static diagrams defining a software system; they are a core part of daily business discussions, from organizational charts to cash flow diagrams. Implementing models as part of the design, deployment and management process would give organizations a deeper way to define and communicate across all participants and aspects involved in the application life cycle.
To make model-driven development a reality, Microsoft is focused on providing a model-driven platform and visual modeling tools that make it easy for all “mainstream” users, including information workers, developers, database architects, software architects, business analysts and IT professionals, to collaborate throughout the application development life cycle. By putting model-driven innovation directly into the Microsoft .NET platform, organizations will gain visibility and control over applications from end to end, ensuring that they are building systems based on the right requirements, simplifying iterative development and re-use, and resolving potential issues at a high level before they start committing resources.
“We’re building modeling in as a core part of the platform,” said Bob Muglia, senior vice president, Server and Tools Business at Microsoft. “This enables IT pros to specify their business needs and build applications that work directly from those specifications. It also brings together the different stages of the IT life cycle — connecting business analysts, who specify requirements, with system architects, who design the solution, with developers, who build the applications, and with operations experts, who deploy and maintain the applications. Ultimately, this means IT pros can innovate and respond faster to the needs of their business.”
OMG has been an international, open-membership, not-for-profit computer industry consortium since 1989. OMG’s modeling standards include the Unified Modeling Language™ (UML®) and Business Process Management Notation (BPMN™). In addition to joining the organization, Microsoft will take an active role in numerous OMG working groups to help contribute to the open industry dialogue and assist with evolution of the standards to meet mainstream customer needs. For example, Microsoft is already working with the finance working group on information models for insurance business functions related to the property and casualty industry, and will eventually look to expand those models so that they can be applied to P&C, life and reinsurance. Another early focus will be on developing specifications for converting messages across the various payments messaging standards.
“Microsoft has always been one of the driving forces in the development industry, helping to make innovation possible but also simplifying many of the most challenging aspects of the application development process,” said Dr. Richard Mark Soley, CEO at OMG. “In less than 10 years, OMG’s UML, a cornerstone of the Model Driven Architecture initiative, has been adopted by the majority of development organizations, making OMG the seminal modeling organization and supporting a broad array of vertical market standards efforts in healthcare, finance, manufacturing, government and other areas. Microsoft’s broad expertise and impact will make its membership in OMG beneficial to everyone involved.”
Developers can begin to implement model-driven approaches today through innovations such as Extensible Application Markup Language (XAML) — the declarative model that underlies Windows Presentation Foundation and Windows Workflow Foundation — and ASP.NET MVC, which deeply integrates model-driven development into the .NET Framework and makes it easy to implement the model-view-controller (MVC) pattern for Web applications. Both XAML and MVC are examples of models that drive the actual runtime behavior of .NET applications. These are part of Microsoft’s broader companywide efforts to deliver a connected platform modeling, which includes technologies being delivered across both “Oslo” and Visual Studio “Rosario” initiatives.
(En Microsoft PressPass)
Monday, September 08, 2008
Google Chrome on InfoQ

These days, several million enthusiasts (myself included) are trying out Chrome, Google's browser, in its first public release. Clearly, a depth charge has been dropped into the market; like other products from its owner, this is only the beginning, and we will see much more.
Geoffrey Wiseman has published on InfoQ a brief but comprehensive article on its status and prospects that is well worth reading.
On the industry landscape, Wiseman observes:
Many people have heralded the launch as the renewal of the browser wars once fought between Microsoft and Netscape / Mozilla (those were the primary contenders, although every browser has its contingent willing to trumpet its strengths). Some are willing to count Chrome out already, while others are adopting a wait-and-see stance.
Many argue that Google doesn't wish to compete with other browsers, simply to advance the state of network-delivered applications to where they are indistinguishable from desktop applications and, in so doing, push the operating system into the background.
In particular, people telling this story love to cast Microsoft in the opposing role, so that one can imagine the two titans clashing.
For my part, I won't replace Firefox (for now), because Chrome is still incomplete for some of the uses I keep in Firefox today, but its advantages so far are undeniable, starting with performance. As of September, the fight is on.
In his technical summary, Wiseman writes:
The Chrome browser is the result of the Chromium project, which connects the WebKit web browser engine with the new Google V8 JavaScript Engine, the Skia vector graphics engine, and Google Gears.
The WebKit browser engine began its life as a fork of the KDE project's KHTML and KJS engines by Apple, becoming the basis of the Safari browser. WebKit was later re-adopted by KDE. Google already employs WebKit within their Android mobile phone platform, and it became the obvious solution for them. As the comic introduction to Chrome states:
It uses memory efficiently, was easily adapted to embedded devices, and it was easy for new browser developers to learn to make the code base work. Browsers are complex. One of the things done well with WebKit is that it's kept SIMPLE.
The version of WebKit used in the initial Windows beta seems to be WebKit 525.13, which is not the most recent version, and has some security vulnerabilities (see Security below). Some users have also noticed rendering differences from Safari's WebKit rendering to Chrome's, including antialiasing and shadows. This may be the result of the Skia graphics engine used under the hood.
Talking about the integration with WebKit, the Chromium FAQ says:
The Chromium source code includes a copy of the WebKit source. We frequently snapshot against the WebKit tip of tree or specific branches according to our release needs.
Our goal is to reduce the size and complexity of the differences between the copy we maintain in order to work more effectively as a participant in the WebKit community and also to make periodic updates occur more smoothly.
The V8 JavaScript Engine is open-source and hosted on Google Code, but was written for Chrome, rather than adopting an existing JavaScript engine. V8 is written in ~100,000 lines of C++ and can be run standalone or embedded in C++ applications.
The foremost reason for V8's creation seems to be performance. The V8 Design Documentation states, "V8 is ... designed for fast execution of large JavaScript applications." The Chromium Blog on V8 is entitled "The Need for Speed" and states:
Google Chrome features a new JavaScript engine, V8, that has been designed for performance from the ground up. In particular, we wanted to remove some common bottlenecks that limit the amount and complexity of JavaScript code that can be used in Web applications.
V8 claims a number of performance improvements and innovations: fast property access using hidden classes, dynamic machine code generation, efficient garbage collection (stop-the-world, generational, accurate, compacting), small object headers, and a multi-threaded design from the ground up. The team that created V8 was headed by Lars Bak, who, as Avi Bryant says, was "the technical lead for both Strongtalk and the HotSpot Java VM, and a huge contributor to the original Self VM" and has a number of VM-related patents to his name.
V8 is not a virtual machine in the classic sense as Matthieu Riou points out: there's no intermediate representation or byte-code. As a result, you cannot write your own language that compiles to "V8 byte code", although you can cross-compile to JavaScript. Despite this, Dave Griswold believes that V8 could serve as the engine for other dynamic languages:
I think these properties will rapidly make V8 the dominant VM for dynamic languages. It ought to make a great platform for Smalltalk.
Google Gears has also moved into the Chromium Project, as pointed out in the FAQ:
With Gears as a plug-in to Chromium we're carrying two copies of sqlite and two copies of V8. That's silly. We're integrating the code so Gears can run great in Chromium. We plan to continue to build Gears for other browsers out of the same code base.
Google Chrome supports plugins for content handling, such as Flash and PDF, but it does not yet support browser extensions; extension support is planned.
Sunday, September 07, 2008
Plex Beta 6.1
Model-Based Service Development
There are other improvements, in Java, in iSeries, and in model handling. But this one is particularly interesting:
This feature strengthens CA Plex support for SOA development by providing model-based service development capabilities directly in the product. Services are represented as objects within the Plex model, using the component modeling approach already established for COM and EJB objects.
WCF service generation is supported with this release and a plug-in architecture enables developers to create their own service generators.
WCF Service Generation
Windows Communication Foundation (WCF) is a new communication
subsystem within the Microsoft .NET Framework that unifies several different communication technologies such as web services, .NET remoting, message queuing, and so on.
The WCF service generation in CA Plex r6.1 enables you to present business logic as services based on WCF. This can include business logic developed in the Plex model and logic from third-party applications.
Service Wrappers and Cross-Platform Interoperability
The new WCF service generation is designed to support convenient wrappering of existing applications as services, including Java, i5/OS, and .NET programs. In general, this means that the target of a "FNC implemented by FNC" triple can be a Java or RPG function. In the case of RPG, the target function can correspond to an i5/OS program developed outside CA Plex, such as a program developed with CA 2E.
Wednesday, August 20, 2008
UML/DSL, by Johan Den Haan (regarding Steven Kelly)
I definitely agree with Steven [...] that using UML and DSLs as presented by Cameron isn't a very good idea. I do, however, think that the worlds of MDE (MDE is broader in scope than MDA; it adds multiple modeling dimensions and a software engineering process) and DSLs aren't opposites. I think that both DSLs and MDE are necessary assets for model-driven approaches. While multiple DSLs are needed to describe a software artifact (see for example the different architectural aspects of Service-Oriented Business Applications (SOBA)), MDE is needed to provide a framework for connecting the different DSLs. An MDE methodology defines a framework of dimensions and their intersections, thereby defining the different models needed to describe a certain software application. This information also gives us the opportunity to discuss the needed DSLs in a (more or less) formal way. Last but not least, an MDE methodology also describes a software engineering process and a maintenance process, thereby defining the order in which models should be produced, how they are transformed into each other (if applicable), and how to change an existing software system using models.
It is worthwhile, even healthy, to go through all of Johan's material.
More on other approaches later...
Sunday, August 17, 2008
Steven Kelly on Microsoft, UML, DSL
That "explaining its nature well" does not include his image of "putting the horse inside the cart"...
Now things start to become a little clearer! The UML models are being used like MDA's PIMs, and the DSL models are the PSMs. The DSLs are thus not specific to the problem domain, as they should be, but to the solution domain: they have the implementation concepts of a particular Microsoft framework or library. (I've blogged earlier about the problems of such framework-based DSLs.) Putting UML before DSLs in this way isn't just putting the cart before the horse: it's putting the horse firmly into the cart -- and pulling it yourself.
What makes this all the more ironic is how eager Microsoft were to put the boot into UML and MDA back at the start of the DSL Tools project.
Steven retrotrae el debate a sus orígenes:
If you want to look back at calm, polite, reasoned discussions, try Microsoft's Alan Cameron Wills' and IBM's Simon Johnston's blog posts. If you want to see the big guns fighting it out with good old FUD-slinging, try Steve Cook vs. Grady Booch (Dec 3, 2004).
We will come back to this (vacation permitting)...