908 results for Capability Maturity Model for Software
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Software product lines (SPLs) are diverse systems developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products from the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges for modelling: first, the formalism used for modelling SPLs needs to be modular and scalable; second, it should ensure that all products behave correctly, by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Net variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that covers the various behaviours exhibited by the products of an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
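As a rough illustration of the core idea, consider a toy sketch in Python of a Petri-net transition guarded by a feature expression, so that one net covers all products and the selected feature set decides which transitions can fire. This is a pedagogical sketch only; the paper's formal definition of Feature Nets is richer, and the names below are invented for illustration.

# Toy sketch: a Petri-net transition fires only if its feature guard
# holds for the selected product. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    inputs: list                    # places consumed
    outputs: list                   # places produced
    guard: frozenset = frozenset()  # features required to enable it

@dataclass
class FeatureNet:
    marking: dict                   # place -> token count
    transitions: list = field(default_factory=list)

    def enabled(self, t, product):
        return t.guard <= product and all(self.marking.get(p, 0) > 0 for p in t.inputs)

    def fire(self, t, product):
        assert self.enabled(t, product)
        for p in t.inputs:
            self.marking[p] -= 1
        for p in t.outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# One net models all products; the chosen feature set decides behaviour.
net = FeatureNet(marking={"idle": 1})
net.transitions = [
    Transition("start_basic", ["idle"], ["running"]),
    Transition("start_logged", ["idle"], ["running", "log"], guard=frozenset({"Logging"})),
]
product = frozenset({"Logging"})
print([t.name for t in net.transitions if net.enabled(t, product)])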
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Published in "Information control in manufacturing 1998 : (INCOM'98) : advances in industrial engineering : a proceedings volume from the 9th IFAC Symposium, Nancy-Metz, France, 24-26 June 1998. Vol. 2"
Abstract:
The prevalence of people reporting pain in the shoulder joint complex, with concomitant limitation of the ability to perform activities of daily living, is high. These prevalence levels burden both patients and society at large. Current scientific evidence points to a relationship between alterations of the scapulothoracic joint and pathologies of the glenohumeral joint. The ability to quantify, kinematically and kinetically, dysfunctions at the scapulothoracic and glenohumeral joints is of enormous importance to both the biomechanics and the clinical communities. In the course of this thesis, a three-dimensional musculoskeletal model of the shoulder joint complex was developed in the OpenSim software, including representations of the thorax/spine, clavicle, scapula, humerus, radius and ulna, the joints that allow the relative movements of these segments, as well as 16 muscles and 4 ligaments. With a total of 11 degrees of freedom, including a new scapulothoracic joint model, the results show that it can reconstruct scapulothoracic and glenohumeral movements quickly and accurately by means of inverse kinematics and of inverse and forward dynamics. It also includes a novel transformation method to determine subject-specific muscle insertion sites. The main motivations underlying this thesis were to deepen current knowledge of shoulder joint complex dysfunctions and, at the same time, to provide the clinical community with a freely available biomechanical tool to better support clinical decisions and thus contribute to more effective practice.
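As a rough illustration of the inverse-kinematics workflow such a model supports, here is a minimal sketch using OpenSim's Python bindings; the file names are hypothetical, and the thesis model itself is not reproduced here.

# Minimal sketch of an OpenSim inverse-kinematics run, assuming the
# opensim Python bindings are installed. File names are hypothetical.
import opensim as osim

model = osim.Model("shoulder_complex.osim")    # musculoskeletal model file
model.initSystem()

ik = osim.InverseKinematicsTool()
ik.setModel(model)
ik.setMarkerDataFileName("reach_trial.trc")    # motion-capture markers
ik.setOutputMotionFileName("reach_trial_ik.mot")
ik.setStartTime(0.0)
ik.setEndTime(2.0)
ik.run()                                       # reconstructs joint angles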
Abstract:
Identification and characterization of the problem. One of the most important problems associated with building software is its correctness. In the quest to provide guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Because of their nature, applying formal methods requires considerable experience and expertise, above all in mathematics and logic, which makes their application costly in practice. As a result, their main application has been confined to critical systems, that is, systems whose malfunction can cause major damage, even though the benefits these techniques provide are relevant to every kind of software. Carrying the benefits of formal methods over to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is an element of great importance. Examples of this are several powerful analysis tools based on formal methods that target source code directly. For the vast majority of these tools, the gap between the notions developers are used to and those needed to apply these formal analysis tools remains too wide. Many tools use assertion languages outside developers' usual knowledge and habits. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artefacts to be analysed grow (scalability). This limitation is widely known and is considered critical for the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools usable by developers who are familiar with the application context but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials employed will be literature relevant to the area and computing equipment. Methods. The methods of discrete mathematics, logic and software engineering will be employed. Expected results. One of the expected results of the project is the identification of specific application domains for formal analysis methods.
We expect the project to produce analysis tools whose usability is adequate for developers without specific training in the underlying formal methods. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by building on well-defined notations founded on solid mathematical grounds. Their precise semantics makes formal methods well suited for analysis, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Consequently, their acceptance by software engineers is rather limited, and formal methods applications have been confined to critical systems. Nevertheless, the advantages that formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are far from being simple enough to be employed by software engineers without experience in formal methods. Another important obstacle to the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and its techniques therefore do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses compared to the general case.
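To make the SAT/SMT-based analysis mentioned above concrete, here is a minimal sketch using the Z3 SMT solver's Python bindings; the property and bounds are invented for illustration and are not taken from the project.

# Minimal sketch of SMT-based assertion checking, assuming the Z3
# solver's Python bindings (pip install z3-solver). Illustrative only.
from z3 import Int, Solver, Implies, And, Not, sat

x, y = Int("x"), Int("y")

# Claimed property: for inputs 0 <= x < 100, y = x + 1 stays below 100.
claim = Implies(And(x >= 0, x < 100, y == x + 1), y < 100)

s = Solver()
s.add(Not(claim))                             # search for a counterexample
if s.check() == sat:
    print("counterexample:", s.model())       # e.g. x = 99, y = 100
else:
    print("property verified for all inputs")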
Abstract:
The relative growth and morphological sexual maturity of Chasmagnathus granulatus Dana, 1851 are described for the first time for a mangrove population. The crabs were collected during low-tide periods in the mangrove of Jabaquara Beach, Paraty, Rio de Janeiro, Brazil. All crabs in the intermolt stage were sexed and had their body parts measured as follows: body height (BH), carapace length (CL) and width (CW), and major cheliped propodus height (PH) and length (PL) for each sex, plus gonopod length (GL) for males and abdomen width (AW) for females. Relative growth was described using the allometric equation y = ax^b, and the size at onset of sexual maturity was estimated using the software Mature I. Specimen size ranged from 4.1 mm to 39.5 mm CW. The growth pattern differed between sexes in the cheliped relationships; the relationship BH vs. CW showed positive allometry for juveniles; PL vs. CW and PH vs. CW showed positive allometry for most crabs except juvenile females; AW vs. CW and GL vs. CW showed positive allometry for juveniles and isometry for adults. The relationships that best indicated the change from the juvenile to the adult phase were PH vs. CW for males and AW vs. CW for females. The size at which 50% of males from this population are mature is 19.7 mm CW (F=144.14; p<0.05), and for females it is 19.2 mm CW (F=166.54; p<0.05). The sizes obtained in this mangrove population are larger than those from previous studies, which could be attributed to the species' plasticity with respect to habitat structure.
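For reference, the allometric equation y = ax^b is typically fitted by linear regression on log-transformed data, since ln y = ln a + b ln x; the sketch below shows this in Python with invented measurements (the study's data are not reproduced here).

# Sketch of the allometric fit y = a * x**b via log-log regression;
# the data values below are made up for illustration.
import numpy as np

cw = np.array([5.2, 8.1, 12.4, 17.8, 24.3, 31.0, 39.5])  # carapace width, mm
ph = np.array([1.1, 1.9, 3.4, 5.8, 9.7, 13.9, 19.2])     # propodus height, mm

b, log_a = np.polyfit(np.log(cw), np.log(ph), 1)  # slope = b, intercept = ln(a)
a = np.exp(log_a)
print(f"y = {a:.3f} * x^{b:.3f}")  # b > 1 indicates positive allometry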
Abstract:
In order to upgrade the reliability of xenodiagnosis, attention has been directed towards the population dynamics of the parasite, with particular interest in the following factors: 1. Parasite density, which by itself is not a research objective but, by giving an accurate portrayal of parasite development and multiplication, has been incorporated into the screening of bugs for xenodiagnosis. 2. On the assumption that food availability might increase parasite density, bugs from xenodiagnosis have been refed at biweekly intervals on chicken blood. 3. Infectivity rates and positives harbouring large parasite yields were based on gut infections, in which the parasite population, comprising all developmental forms, was more abundant and easier to detect than in fecal infections, thus minimizing the probability of recording false negatives. 4. Since parasite density, low in the first 15 days of infection, increases rapidly in the following 30 days, an interval of 45 days has been adopted for the routine examination of bugs from xenodiagnosis. By following the enumerated measures, all aiming to reduce false-negative cases, we are getting closer to a reliable xenodiagnostic procedure. Upgrading the efficacy of xenodiagnosis also depends on the xenodiagnostic agent. Of the 9 vector species investigated, Panstrongylus megistus deserves top priority as a xenodiagnostic agent. Its extraordinary capability to support fast development and vigorous multiplication of the few parasites ingested from a host with chronic Chagas' disease is revealed by the strikingly close infectivity rates of 91.2% vs. 96.4% among bugs engorged on the same host in the chronic and acute phases of the disease, respectively (Table V), the latter involving an estimated 12.3 × 10³ parasites in the circulation at the time of xenodiagnosis, as reported previously by the authors (1982).
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
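To illustrate the Bayesian (a posteriori) adjustment step that most of these programs automate, here is a minimal MAP-estimation sketch for a one-compartment model at steady state; the drug model, prior values and measurement below are all invented for illustration and do not come from the survey.

# Sketch of Bayesian dose individualization: combine a population prior
# on clearance with one measured concentration (MAP estimate).
import numpy as np
from scipy.optimize import minimize_scalar

dose, tau, t = 500.0, 12.0, 6.0     # mg, dosing interval h, sampling time h
v = 20.0                            # volume of distribution, L (fixed here)
cl_pop, omega = 4.0, 0.3            # prior: CL ~ lognormal(ln 4 L/h, 0.3)
c_obs, sigma = 8.5, 1.0             # measured conc. (mg/L) and assay SD

def neg_log_posterior(cl):
    k = cl / v
    c_pred = (dose / v) * np.exp(-k * t) / (1 - np.exp(-k * tau))  # steady state
    ll = -0.5 * ((c_obs - c_pred) / sigma) ** 2        # Gaussian likelihood
    lp = -0.5 * (np.log(cl / cl_pop) / omega) ** 2     # lognormal prior
    return -(ll + lp)

cl_map = minimize_scalar(neg_log_posterior, bounds=(0.5, 20), method="bounded").x
print(f"individual clearance estimate: {cl_map:.2f} L/h")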
Abstract:
Coltop3D is a software package that performs structural analysis using digital elevation models (DEMs) and 3D point clouds acquired with terrestrial laser scanners. A color representation merging slope aspect and slope angle is used to assign a unique color to each orientation of a local slope; a continuous planar structure thus appears in a single color. Several tools are included to create stereonets, draw traces of discontinuities, or compute density stereonets automatically. Examples are shown to demonstrate the efficiency of the method.
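The aspect/slope-to-color idea can be sketched in a few lines of Python: hue encodes slope aspect and brightness encodes slope angle, so each orientation maps to a unique color. The DEM and cell size below are synthetic, and Coltop3D's exact color scheme may differ.

# Sketch: map DEM slope aspect to hue and slope angle to brightness.
import numpy as np
from matplotlib.colors import hsv_to_rgb

dem = np.random.rand(64, 64) * 50.0       # synthetic elevations, m
dzdy, dzdx = np.gradient(dem, 10.0)       # 10 m cell size (assumed)

slope = np.arctan(np.hypot(dzdx, dzdy))   # slope angle, rad
aspect = np.arctan2(-dzdx, dzdy)          # slope aspect, rad

hue = (aspect + np.pi) / (2 * np.pi)      # aspect -> hue in [0, 1]
val = slope / (np.pi / 2)                 # steeper -> brighter
rgb = hsv_to_rgb(np.dstack([hue, np.ones_like(hue), val]))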
Abstract:
q-Space-based techniques such as diffusion spectrum imaging, q-ball imaging, and their variants have been used extensively in research for their ability to delineate complex neuronal architectures, such as multiple fiber crossings within a single image voxel. The purpose of this article is to provide an introduction to the q-space formalism and the principles of basic q-space techniques, together with a discussion of the advantages as well as the challenges of translating these techniques into the clinical environment. A review of currently used q-space-based protocols in clinical research is also provided.
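For orientation, the core q-space relation underlying these techniques, in its standard narrow-pulse formulation (a textbook statement, not reproduced from the article itself), is

E(\mathbf{q}) = \int P(\mathbf{R})\, e^{-i\,2\pi\,\mathbf{q}\cdot\mathbf{R}}\, d\mathbf{R}, \qquad \mathbf{q} = \frac{\gamma\,\delta\,\mathbf{G}}{2\pi}

where E(q) is the normalized diffusion signal, P(R) the ensemble-average displacement propagator, γ the gyromagnetic ratio, δ the gradient pulse duration, and G the diffusion gradient vector; the propagator is recovered by an inverse Fourier transform of the sampled signal.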
Abstract:
Given the urgency of a new paradigm in wireless digital transmission that allows for higher bit rates, lower latency and tighter delay constraints, it has been proposed to investigate the fundamental building blocks that, at the circuit/device level, will drive the change towards a more efficient network architecture with higher capacity, higher bandwidth and a more satisfactory end-user experience. At the core of each transceiver are inherently analog devices capable of providing the carrier signal: the oscillators. It is strongly believed that many limitations of today's communication protocols could be relieved by permitting high-carrier-frequency radio transmission with some degree of reconfigurability. This led us to study distributed oscillator architectures that work in the microwave range and possess wideband tuning capability. As microwave oscillators are essentially nonlinear devices, a full nonlinear analysis, synthesis and optimization had to be considered for their implementation. Consequently, the most widely used nonlinear numerical techniques in commercial EDA software have been reviewed. An application of these techniques is shown for a system of three coupled oscillators (a "triple-push" oscillator), in which the stability of the various oscillating modes has been studied. Provided that a certain phase distribution is maintained among the oscillating elements, this topology raises the output power of the third harmonic; nevertheless, due to circuit symmetry, "unwanted" oscillating modes coexist with the intended one. Starting from the necessary background on distributed amplification and distributed oscillator theory, the design of a four-stage reverse-mode distributed voltage-controlled oscillator (DVCO) using lumped elements is presented. All the design steps are reported and, for the first time, a method for an optimized design with reduced variations in the output power is presented. Ongoing work is devoted to modelling a wideband DVCO and implementing a frequency divider.
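A toy phase model can illustrate why a triple-push topology favors the mode with 120-degree phase spacing, where the fundamentals cancel and the third harmonics add in phase. This is a pedagogical sketch under that phase assumption, not the nonlinear (harmonic-balance) analysis used in the work itself, and the carrier frequency is invented.

# Toy sketch: three identical tones with 120-degree spacing.
import numpy as np

phases = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])  # rotating mode
t = np.linspace(0, 1e-9, 1000)                          # 1 ns window
f0 = 10e9                                               # 10 GHz carrier (assumed)

fund = sum(np.cos(2 * np.pi * f0 * t + p) for p in phases)
third = sum(np.cos(3 * (2 * np.pi * f0 * t + p)) for p in phases)

print(f"fundamental peak: {np.max(np.abs(fund)):.2e}")   # ~0 (cancels)
print(f"3rd harmonic peak: {np.max(np.abs(third)):.2f}") # ~3 (adds)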
Abstract:
The identification of genetically homogeneous groups of individuals is a long-standing issue in population genetics. A recent Bayesian algorithm implemented in the software STRUCTURE allows the identification of such groups. However, the ability of this algorithm to detect the true number of clusters (K) in a sample of individuals when patterns of dispersal among populations are not homogeneous has not been tested. The goal of this study is to carry out such tests, using various dispersal scenarios from data generated with an individual-based model. We found that in most cases the estimated 'log probability of data' does not provide a correct estimation of the number of clusters, K. However, using an ad hoc statistic, DeltaK, based on the rate of change in the log probability of data between successive K values, we found that STRUCTURE accurately detects the uppermost hierarchical level of structure for the scenarios we tested. As might be expected, the results are sensitive to the type of genetic marker used (AFLP vs. microsatellite), the number of loci scored, the number of populations sampled, and the number of individuals typed in each sample.
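The DeltaK statistic described above is commonly computed as the mean absolute second difference of the log probability of data across replicate runs, divided by its standard deviation: DeltaK = mean(|L(K+1) - 2L(K) + L(K-1)|) / sd(L(K)). A minimal sketch in Python follows; the log-probability values are invented for illustration.

# Sketch of DeltaK from replicate STRUCTURE runs (values invented).
import numpy as np

# rows: replicate runs; columns: K = 1..5
L = np.array([
    [-5200, -4700, -4150, -4120, -4110],
    [-5210, -4690, -4160, -4125, -4100],
    [-5195, -4710, -4140, -4130, -4115],
], dtype=float)

sd_L = L.std(axis=0, ddof=1)
second = np.abs(L[:, 2:] - 2 * L[:, 1:-1] + L[:, :-2]).mean(axis=0)
delta_K = second / sd_L[1:-1]                 # defined for K = 2..4
best_K = int(np.argmax(delta_K)) + 2
print("DeltaK:", delta_K, "-> best K =", best_K)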
Abstract:
This final-year project (TFC) aims to present the CMM model, describe a management system that covers its level 3 requirements, propose a deployment plan, and evaluate the costs and benefits of adopting the new management system for a company dedicated to software development and maintenance. The result of the work is this report itself and its annexes.
Abstract:
The main idea is to create a model of a telecommunications network that can be managed from geographic information system (GIS) software. The goal is to design a simple fibre-optic network similar to those built to connect customers directly with this technology.