Comments, discussions, and notes on trends in the development of information technology, and on the importance of quality in software construction.

Friday, April 29, 2005
Configuration management tools and the software lifecycle

Early configuration management tools provided version control for source code files. By versioning source code files, changes made by one developer to one file could be preserved. Other developers could take these versions, modify them, and create new versions. The ability to version files became a prerequisite for team development. Over time, SCM tools offered integrated defect tracking, since testing during the software development lifecycle identified bugs that needed to be fixed.

Next came workflow automation. Tool vendors added process models and workflow engines to their SCM tools so that an organization could model and execute its software development process. Process items could model the activities at the core of the software development lifecycle, such as gathering requirements, specifying features and functionality, designing, coding, and testing. A workflow engine could move each process item through a sequence of states as the activity modeled by the process item progressed.

Over time, process modeling capabilities in the tools evolved. Today, SCM tools can model complex development processes. SCM tool users have come to expect flexible and extensible process models that can be adapted easily as an organization's processes change. In addition, SCM tool users now look for an integration between requirements management and core SCM features. The need to ensure that a software system satisfies an evolving set of requirements is what drives this integration. Customer expectations and government regulations now make traceability from requirements to source code files a requirement for the software development lifecycle.
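A minimal sketch of that workflow idea, in C and with invented state names (no particular SCM tool's API is implied): a process item is a record carrying a state, and the engine advances it through the modeled sequence one legal transition at a time.

#include <stdio.h>

/* Hypothetical lifecycle states for a process item; a real SCM tool
   lets the organization define its own states and transitions. */
typedef enum { SUBMITTED, ANALYZED, DESIGNED, CODED, TESTED, CLOSED } State;

typedef struct {
    int id;
    State state;
} ProcessItem;

static const char *state_name(State s) {
    static const char *names[] = { "submitted", "analyzed", "designed",
                                   "coded", "tested", "closed" };
    return names[s];
}

/* The engine advances an item one step at a time; skipping states is
   impossible, which is how the process model gets enforced. */
static int advance(ProcessItem *item) {
    if (item->state == CLOSED)
        return 0;                /* terminal state: no further transition */
    item->state++;
    return 1;
}

int main(void) {
    ProcessItem feature = { 42, SUBMITTED };
    while (advance(&feature))
        printf("item %d -> %s\n", feature.id, state_name(feature.state));
    return 0;
}

The point of the sketch is only that the engine, not the developer, owns the legal transitions; real tools add per-state permissions, triggers, and user-defined transition graphs.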
Wednesday, April 27, 2005
The asynchronous model: an article on Developer.com
Sunday, April 24, 2005
Is technological evolution based on scientific criteria?
Mr X aims at OO in his critique, devoting a particular article to it, but the criterion is broadly applicable. We are not looking at a process of rational, scientific evolution, but at a Darwinian one, in which the quality of an idea is only part of the story; another part, and not a minor one, is the size of the capital backing a concept. Written in 2001, the piece is even more applicable today, when the sharp concentration of the IT market produces a situation in which technology is almost entirely vendor-dependent. For example, what makes the Software Factories concept so different that it cannot work alongside MDA? Or, going further, would the criticisms of Greenfield, Short, and others be so acid in a different scenario?
In Mr X's words:
Some may suggest that the chaotic fluctuations are necessary for progress. Since the merit of ideas cannot be readily quantified, the only remaining choices are stagnation or trial-and-error to find the better products or technologies. Given that stagnation is not the way of America, the remaining choice is trial-and-error. I don't fully agree with that. In my opinion, the merit of new concepts, languages, and paradigms should be required to be exposed to tough and open scrutiny before it could be heavily promoted as an improvement. Of course, this cannot easily be enacted into law.
Thus, the only solution may be education about the hype process. If people realized that they are being bamboozled to forever ride the tech-hype treadmill, then people may be less likely to be suckered in. I should point out that it is not any evil plot by one group, just a side-effect of active capitalism. Rather than throw the baby out with the bath water, let's clean up the bath water a bit.
Tuesday, April 19, 2005
Popkin becomes part of Telelogic
...A few days later... (May 30)
The comment from www.methodsandtools.com assumes that Popkin has found a partner...
Telelogic in Acquisition Mood
The Swedish company Telelogic has announced the acquisition of Popkin Software, the US-based editor of System Architect, for $45 million in cash. Popkin's revenues for 2004 were $19.1 million and it currently had 108 employees. Telelogic has also launched a bid to buy Focal Software. With 26 employees, Focal Software develops and sells web-based solutions for decision support in product development and project portfolio analysis. Founded in 1983, Telelogic currently has more than 750 employees. Its main products are DOORS, a tool for requirements management, TAU, dedicated to design, implementation and tests, and SYNERGY, for configuration management. With these acquisitions, Telelogic completes its product portfolio and expands its market opportunities. In this relatively good year for software development tool vendors, you can expect other acquisitions or mergers. In this case, Jan Popkin has found for his company a partner that will allow it to develop its product.
Saturday, April 16, 2005
The roots of Object Orientation
Lahman also describes the weak points of structured design, singling out two, both tied to the maintenance of structured code: the modification of state variables at unpredictable points in the code, and hierarchical dependencies:

The first systematic attempt to eliminate hackers appeared in the form of Structured Programming, which provided a collection of good practices for writing 3GL code. That was quickly followed by Structured Design and Structured Analysis, both of which introduced more abstract graphical representations of programs. The dominant design technique became top-down functional decomposition, where the solution was started with a very simple and general statement of the problem solution and then one successively decomposed that solution into more detailed levels. Each statement of functionality was collected as a node in an inverted "tree" whose lowest leaves were logically indivisible.
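As a toy illustration of that top-down decomposition, in C and with invented functions: the root states the whole job in one line, each level refines it, and the leaves are trivially small operations.

#include <stdio.h>

/* Leaves of the decomposition tree: logically indivisible steps. */
static double read_measurement(void)          { return 21.5; }  /* stub input */
static double celsius_to_fahrenheit(double c) { return c * 9.0 / 5.0 + 32.0; }
static void   print_result(double f)          { printf("%.1f F\n", f); }

/* Mid-level node: refines "convert" into leaf operations. */
static double convert(double celsius) {
    return celsius_to_fahrenheit(celsius);
}

/* Root node: the simple, general statement of the whole solution. */
static void report_temperature(void) {
    double c = read_measurement();
    double f = convert(c);
    print_result(f);
}

int main(void) {
    report_temperature();
    return 0;
}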
The impact of SA/SD/SP was enormous. Defect rates dropped from 150/KLOC to 5/KLOC. In addition, productivity for large projects where multiple programmers had to coordinate efforts improved greatly. Instead of 1000 programmers working for 10 years to produce 1 MLOC, 200 programmers could do the same job in 2-5 years.
We could say that even today these observations are largely verifiable; not to mention that (I can vouch for Argentina and Chile) we can still find, fairly frequently, systems built under this paradigm, or documented with the method. If I dug a little deeper, I would find study centers that still teach it as the method to apply...

There were a lot of problems that led to the Maintainability Gap, but they could be broadly categorized as having two root causes: uncontrolled access to state variables and hierarchical implementation dependencies. State variable access was primarily a defect problem, as data was modified in unexpected ways at unexpected times during execution. That resulted in additional test-and-repair cycle time when one modified existing code, because it was difficult to predict how changes would affect untouched code that happened to access the same data.
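A contrived C sketch of that first root cause: a state variable writable from anywhere, so a procedure added later for an unrelated purpose silently changes what untouched code observes.

#include <stdio.h>

/* A shared state variable, writable from anywhere in the program. */
static int record_count = 0;

static void load_records(void)  { record_count = 10; }

/* Quietly resets the counter as a side effect: data modified "in
   unexpected ways at unexpected times", as the quote puts it. */
static void purge_cache(void)   { record_count = 0; }

static void print_summary(void) { printf("%d records\n", record_count); }

int main(void) {
    load_records();
    purge_cache();      /* added later, for an unrelated feature */
    print_summary();    /* now prints 0, though no summary code changed */
    return 0;
}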
Hierarchical implementation dependencies resulted in the legendary "spaghetti code". That was because the leaf nodes in the functional decomposition tree were at a very fine level of abstraction -- essentially arithmetic or logical operators in the 3GL. It was simply too tedious to cobble together lengthy sequences of such atomic operations to do complex tasks. However, the higher-level nodes in the functional decomposition tree quite conveniently captured such sequences as descendants. Since these nodes were systematically derived, they had defined functional semantics. That allowed them to be reused (i.e., accessed by "clients" in different parts of the application that happened to need the same sequence of leaf operations).
That sort of reuse through accessing higher-level functions was a boon to developers and led to the notion of "procedural development" because it made excellent use of the core characteristic of 3GLs, block structuring around procedures. The problem, though, was that the functional decomposition "tree" now became a lattice where each node potentially had multiple ancestors (clients) as well as multiple descendants. It was that fanout of dependency that led to spaghetti code.
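Continuing in the same vein, a hypothetical C sketch of how the tree turns into a lattice: a mid-level node written for one branch acquires a second client from another branch, giving it two ancestors.

#include <stdio.h>

/* Mid-level node, originally written for the report branch of the tree. */
static void format_record(const char *name, int qty) {
    printf("%-10s %4d\n", name, qty);
}

/* Original client: its specification subsumes format_record's. */
static void print_report(void) {
    format_record("bolts", 120);
    format_record("nuts", 80);
}

/* Second client from another branch reuses the same node; the tree
   is now a lattice, with two ancestors above format_record. */
static void print_invoice(void) {
    format_record("bolts", 120);
}

int main(void) {
    print_report();
    print_invoice();
    return 0;
}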
The dependencies existed because in top-down functional decomposition the lower-level functions are extensions of their parent higher-level function. That is, the specification of the higher-level function included the specification of the lower-level functions. Thus any contract between the client and the higher-level function depended upon the specification of the entire descendant tree of functions. So if one changed the specification of a lower-level function, the specification of all of its higher-level ancestors was also changed.
That was no problem so long as the access structure was a pure tree. That's because the change was probably triggered by a need to change the specification of a higher-level function and implementing the fix in the lower-level function was simply the easiest place to do it. However, when one has a lattice, the higher-level functions have multiple clients. If only one client wants the change, the other clients may be broken by the change. Worse, there can be a client at any level of ancestry in the tree, so the change may break clients that are not even direct clients of the original higher-level function. The result was a disaster for maintainability because every change for one client could potentially break a host of other clients. Fixing things to keep all clients happy often resulted in major surgery to the tree or very complex parameterization that complicated the functions.
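Extending the previous hypothetical sketch: suppose the invoice client asks for a price column. Changing the shared node outright would break the report client, so the node grows a mode flag instead, exactly the kind of complex parameterization described above.

#include <stdio.h>

/* The shared node now carries a flag so one client's change does not
   break the other; as the client lattice grows, so do the flags. */
static void format_record(const char *name, int qty,
                          int with_price, double price) {
    if (with_price)
        printf("%-10s %4d  $%8.2f\n", name, qty, price);
    else
        printf("%-10s %4d\n", name, qty);
}

static void print_report(void) {
    format_record("bolts", 120, 0, 0.0);   /* untouched client, yet edited */
}

static void print_invoice(void) {
    format_record("bolts", 120, 1, 0.15);  /* the client that wanted the change */
}

int main(void) {
    print_report();
    print_invoice();
    return 0;
}

Note that every existing call site had to be edited just to keep its old behavior, which is itself part of the maintenance cost the quote describes.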