908 results for Capability Maturity Model for Software


Relevance:

30.00%

Publisher:

Abstract:

Creation of a commercial and accounting management application, starting from the one already in place at the company Ph Systems S.L. and therefore following the technology used in that application, which was developed in Visual Basic 6 and uses in-house objects and libraries to simplify linking forms to the database tables through both ODBC and OLE DB connections. The database management system selected for the project is Oracle, which is what is currently used for the data of the stock management application. Tasks to carry out: analysis of the system requirements; design of the different modules of the future application; obtaining a well-defined database starting from the existing one; implementation of the modules of the future application. The learning objectives of this project are: to improve knowledge of managing an Oracle database, and to improve command of the Visual Basic 6 programming language.

Relevance:

30.00%

Publisher:

Abstract:

In Empire and Multitude, Antonio Negri and Michael Hardt propose that in today's world the dominant force controlling capitalism, and therefore power, is the Empire. The Empire draws its strength from the control of intellectual production, and its power is growing during this period of transition in the capitalist model. This essay argues that those oppressed by the Empire, who as a class make up the multitude, need free software in order to create their dream: democracy. Free software is at once the best example of what democracy can be and a tool that allows it to be extended. Moreover, its potential in the Andean region is even greater because of the weakness of the liberal-democracy model promoted by the Empire.

Relevance:

30.00%

Publisher:

Abstract:

Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
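
The state-augmentation idea itself is compact. The sketch below is a minimal illustration, not the authors' scheme: a toy 1D bed-form advection model whose state vector is augmented with an uncertain migration speed and updated each assimilation cycle with a single 3D-Var/BLUE analysis step. The grid size, error covariances and noise levels are all invented for the example.

```python
# Minimal 3D-Var state-augmentation sketch for a toy 1D bed-form model.
# All parameter values are illustrative only.
import numpy as np

nx, dx, dt = 50, 1.0, 1.0
true_speed = 0.4                          # "true" migration speed (unknown to the model)
x_grid = np.arange(nx) * dx

def propagate(bed, speed):
    """Advect the bed form by speed*dt (periodic domain, linear interpolation)."""
    return np.interp((x_grid - speed * dt) % (nx * dx), x_grid, bed, period=nx * dx)

bed = np.sin(2 * np.pi * x_grid / (nx * dx))      # background bed form
speed_guess = 0.1                                 # poor first guess of the parameter

# Background error covariance B for the augmented state, estimated from an ensemble
# of speed perturbations so that bed/parameter cross-covariances are present;
# these cross terms are what allow observations of the bed to correct the parameter.
ens = np.array([np.append(propagate(bed, s) - propagate(bed, speed_guess), s - speed_guess)
                for s in speed_guess + 0.2 * np.random.randn(40)])
B = np.cov(ens.T) + 1e-6 * np.eye(nx + 1)
R = 0.01 * np.eye(nx)                             # observation error covariance
H = np.hstack([np.eye(nx), np.zeros((nx, 1))])    # observe bathymetry only

truth, background, speed = bed.copy(), bed.copy(), speed_guess
for cycle in range(20):
    truth = propagate(truth, true_speed)
    background = propagate(background, speed)
    y = truth + 0.1 * np.random.randn(nx)         # noisy bathymetry observations
    xb = np.append(background, speed)
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # 3D-Var / BLUE gain
    xa = xb + K @ (y - H @ xb)
    background, speed = xa[:nx], xa[nx]

print(f"estimated migration speed after assimilation: {speed:.2f} (truth {true_speed})")
```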

Relevance:

30.00%

Publisher:

Abstract:

Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: Current requirements engineering methods can model the common user's behaviour in the domain of knowledge construction. The user's requirements can be represented as a case in a defined structure, which can be reasoned over to support requirements analysis. Such analysis needs to be extended so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: A qualitative experiment was conducted in which a medium-sized group of users validated the method and its techniques (articulate, map, configure, and learning content); the results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques that support profiling the user's requirements, reasoning over requirements patterns, generating workflow from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our outcome, the ontology model with its techniques, can further enhance these approaches for modelling individual users' needs and discovering their requirements.
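
As an illustration only (the structure and field names below are hypothetical, not the paper's ontology), an individual requirement might be captured as a structured case and transformed into an information provision specification along these lines:

```python
# Hypothetical sketch of a requirements "case" and its mapping onto a
# personalised information provision specification.
from dataclasses import dataclass, field

@dataclass
class RequirementCase:
    user_id: str
    purpose: str                 # what the learner wants to achieve
    prior_knowledge: list[str]   # topics the learner already knows
    goal_topics: list[str]       # topics the learner wants to construct next

@dataclass
class ProvisionSpec:
    user_id: str
    content_items: list[str] = field(default_factory=list)

def formulate_spec(case: RequirementCase, catalogue: dict[str, list[str]]) -> ProvisionSpec:
    """Map a requirements case onto content, skipping topics already known."""
    spec = ProvisionSpec(user_id=case.user_id)
    for topic in case.goal_topics:
        if topic not in case.prior_knowledge:
            spec.content_items.extend(catalogue.get(topic, []))
    return spec

catalogue = {"ontology-modelling": ["intro-owl.pdf", "owl-exercises.html"]}
case = RequirementCase("u42", "exam preparation", ["uml"], ["ontology-modelling"])
print(formulate_spec(case, catalogue))
```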

Relevance:

30.00%

Publisher:

Abstract:

Aeolian mineral dust aerosol is an important consideration in the Earth's radiation budget as well as a source of nutrients to oceanic and land biota. The modelling of aeolian mineral dust has been improving consistently despite the relatively sparse observations available to constrain such models. This study documents the development of a new dust emissions scheme in the Met Office Unified Model™ (MetUM) based on the Dust Entrainment and Deposition (DEAD) module. Four separate case studies are used to test and constrain the model output. Initial testing was undertaken on a large dust event over North Africa in March 2006, with the model constrained using AERONET data. The second case study involved testing the capability of the model to represent dust events in the Middle East without being re-tuned from the March 2006 case in the Sahara. While the model is unable to capture some of the daytime variation in AERONET AOD, there is good agreement between the model and observed dust events. In the final two case studies, new observations from in situ aircraft data during the Dust Outflow and Deposition to the Ocean (DODO) campaigns in February and August 2006 were used. These recent observations provided further data on dust size distributions and vertical profiles to constrain the model. The modelled DODO cases were also compared to AERONET data to make sure the radiative properties of the dust were comparable to observations. Copyright © 2009 Royal Meteorological Society and Crown Copyright
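
The kind of point comparison used to constrain such a scheme can be sketched simply; the numbers below are invented, not DODO or AERONET data:

```python
# Small sketch of matching modelled aerosol optical depth (AOD) against
# AERONET retrievals at a station and summarising the agreement.
import numpy as np

aeronet_aod = np.array([0.35, 0.80, 1.20, 0.95, 0.40])   # hypothetical daily AERONET AOD
model_aod   = np.array([0.30, 0.65, 1.05, 1.10, 0.45])   # hypothetical co-located model AOD

bias = np.mean(model_aod - aeronet_aod)
rmse = np.sqrt(np.mean((model_aod - aeronet_aod) ** 2))
corr = np.corrcoef(model_aod, aeronet_aod)[0, 1]
print(f"bias={bias:+.3f}  rmse={rmse:.3f}  r={corr:.2f}")
```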

Relevance:

30.00%

Publisher:

Abstract:

Consider the statement "this project should cost X and has risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small to medium scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.
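
A minimal sketch of the kind of Bayesian Belief Network reasoning involved, with two discrete nodes and invented expert-judged probabilities (not the study's preliminary model), is:

```python
# Two-node discrete Bayesian network: P(cost overrun | requirements volatility).
# All probabilities are hypothetical expert judgements.
volatility_prior = {"low": 0.6, "high": 0.4}            # P(requirements volatility)
overrun_given_vol = {                                   # P(overrun | volatility)
    "low":  {"none": 0.7, "moderate": 0.25, "severe": 0.05},
    "high": {"none": 0.3, "moderate": 0.45, "severe": 0.25},
}

# Predictive distribution by marginalisation: P(overrun) = sum_v P(overrun | v) P(v)
overrun = {o: sum(volatility_prior[v] * overrun_given_vol[v][o] for v in volatility_prior)
           for o in ("none", "moderate", "severe")}
print(overrun)

# Conditioning on evidence (volatility observed to be high) selects one column:
print(overrun_given_vol["high"])
```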

Relevance:

30.00%

Publisher:

Abstract:

Background and Aims: Using two parental clones of outcrossing Trifolium ambiguum as a potential model system, we examined how during seed development the maternal parent, number of seeds per pod, seed position within the pod, and pod position within the inflorescence influenced individual seed fresh weight, dry weight, water content, germinability, desiccation tolerance, hardseededness, and subsequent longevity of individual seeds. Methods: Near-simultaneous, manual reciprocal crosses were carried out between clonal lines for two experiments. Infructescences were harvested at intervals during seed development. Each individual seed was weighed and then used to determine dry weight or one of the physiological behaviour traits. Key Results: Whilst population mass maturity was reached at 33–36 days after pollination (DAP), seed-to-seed variation in maximum seed dry weight, when it was achieved, and when maturation drying commenced, was considerable. Individual seeds acquired germinability between 14 and 44 DAP, desiccation tolerance between 30 and 40 DAP, and the capability to become hardseeded between 30 and 47 DAP. The time for viability to fall to 50 % (p50) at 60 % relative humidity and 45 °C increased between 36 and 56 DAP, when the seed coats of most individuals had become dark orange, but declined thereafter. Individual seed fresh weight at harvest did not correlate with air-dry storage survival period. Analysing survival data for cohorts of seeds reduced the standard deviation of the normal distribution of seed deaths in time, but no sub-population showed complete uniformity of survival period. Conclusions: Variation in individual seed behaviours within a developing population is inherent and inevitable. In this outbreeder, there is significant variation in seed longevity which appears dependent on embryo genotype with little effect of maternal genotype or architectural factors.
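
As an illustration of how p50 can be obtained, the sketch below fits a normal distribution of seed deaths in time to synthetic germination data (not the Trifolium ambiguum results) and reads off the storage time at which viability falls to 50 %:

```python
# Estimating p50 by fitting a normal distribution of seed deaths in time
# to germination test results (synthetic data, illustrative only).
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

days   = np.array([0, 10, 20, 30, 40, 50, 60])                   # storage period (d)
viable = np.array([0.98, 0.95, 0.88, 0.70, 0.45, 0.20, 0.08])    # proportion germinating

def survival(t, p50, sigma):
    """Proportion still viable if deaths are normally distributed in time."""
    return 1.0 - norm.cdf(t, loc=p50, scale=sigma)

(p50, sigma), _ = curve_fit(survival, days, viable, p0=(40.0, 15.0))
print(f"p50 ~ {p50:.1f} days, sigma ~ {sigma:.1f} days")
```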

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, a number of mechanistic, dynamic simulation models of several components of the dairy production system have become available. However, their use has been limited by the detailed technical knowledge and special software required to run them, and by the lack of compatibility between models in predicting various metabolic processes in the animal. The first objective of the current study was to integrate the dynamic models of [Brit. J. Nutr. 72 (1994) 679] on rumen function, [J. Anim. Sci. 79 (2001) 1584] on methane production, [J. Anim. Sci. 80 (2002) 248] on N partition, and a new model of P partition. The second objective was to construct a decision support system to analyse nutrient partition between animal and environment. The integrated model combines key environmental pollutants such as N, P and methane within a nutrient-based feed evaluation system. The model was run under different scenarios and the sensitivity of various parameters analysed. A comparison of predictions from the integrated model with the original simulation models showed an improvement in N excretion since the integrated model uses the dynamic model of [Brit. J. Nutr. 72 (1994) 679] to predict microbial N, which was not represented in detail in the original model. The integrated model can be used to investigate the degree to which production and environmental objectives are antagonistic, and it may help to explain and understand the complex mechanisms involved at the ruminal and metabolic levels. Among the outputs of the integrated model were the forms of N and P in excreta and methane, which can be used as indices of environmental pollution. © 2004 Elsevier B.V. All rights reserved.
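
A deliberately crude sketch of the decision-support idea, with invented capture efficiencies rather than the integrated model's mechanistic equations, is:

```python
# Toy nutrient-partition calculator: diet description in, environmental indices out.
# All coefficients are assumed, illustrative values, not the integrated model's.
def nutrient_partition(dm_intake_kg, diet_n_gkg, diet_p_gkg):
    n_intake = dm_intake_kg * diet_n_gkg          # g N/day
    p_intake = dm_intake_kg * diet_p_gkg          # g P/day
    milk_n = 0.25 * n_intake                      # assumed N capture efficiency in milk
    milk_p = 0.30 * p_intake                      # assumed P capture efficiency in milk
    methane_mj = 0.06 * dm_intake_kg * 18.4       # ~6 % of gross energy lost as CH4 (assumed)
    return {
        "milk_N_g": milk_n,
        "excreted_N_g": n_intake - milk_n,
        "milk_P_g": milk_p,
        "excreted_P_g": p_intake - milk_p,
        "methane_MJ": methane_mj,
    }

print(nutrient_partition(dm_intake_kg=20, diet_n_gkg=28, diet_p_gkg=4))
```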

Relevance:

30.00%

Publisher:

Abstract:

Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and put into perspective. Construction of the maximum likelihood estimator of the mixing distribution is done for any number of components up to the global nonparametric maximum likelihood bound using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, with some generalisations of Zelterman's estimator. All computations are done with CAMCR, special software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems in using the mixture model-based estimators are highlighted.
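
Two of the estimators mentioned have closed forms simple enough to sketch. The snippet below (toy frequency counts, not one of the paper's data sets, and without CAMCR) computes Chao's lower-bound estimator and Zelterman's estimator from the frequency-of-frequencies counts f1 and f2:

```python
# Chao and Zelterman population-size estimators from frequency counts
# f[k] = number of units observed exactly k times (toy data).
import math

f = {1: 61, 2: 27, 3: 12, 4: 5, 5: 2}        # hypothetical frequency-of-frequencies data
n = sum(f.values())                           # number of distinct units observed

chao = n + f[1] ** 2 / (2 * f[2])             # N_hat = n + f1^2 / (2 f2)

lam = 2 * f[2] / f[1]                         # Zelterman's Poisson-rate estimate
zelterman = n / (1 - math.exp(-lam))          # N_hat = n / (1 - e^{-lambda_hat})

print(f"observed n = {n}, Chao = {chao:.1f}, Zelterman = {zelterman:.1f}")
```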

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the need for accurate predictions of the fault inflow, i.e. the number of faults found in consecutive project weeks, in highly iterative processes. In such processes, in contrast to waterfall-like processes, fault repair and the development of new features run almost in parallel. Given accurate predictions of fault inflow, managers could dynamically re-allocate resources between these different tasks in a more adequate way. Furthermore, managers could react with process improvements when the expected fault inflow is higher than desired. This study suggests software reliability growth models (SRGMs) for predicting fault inflow. Although these models were originally developed for traditional processes, their performance in highly iterative processes is investigated here. Additionally, a simple linear model is developed and compared to the SRGMs. The paper provides results from applying these models to fault data from three different industrial projects. One of the key findings of this study is that some SRGMs are applicable for predicting fault inflow in highly iterative processes. Moreover, the results show that the simple linear model represents a valid alternative to the SRGMs, as it provides reasonably accurate predictions and performs better in many cases.
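
The comparison described can be sketched concretely. The snippet below uses synthetic weekly fault counts (not the three industrial projects) and fits one classical SRGM and a simple linear model to the cumulative fault inflow, then extrapolates one week ahead; the paper does not name its SRGMs here, so the Goel-Okumoto model is used purely as a familiar example.

```python
# Fit the Goel-Okumoto SRGM, m(t) = a*(1 - exp(-b*t)), and a simple linear model
# to cumulative fault inflow, then predict the next week's inflow (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

weeks = np.arange(1, 11)
weekly_faults = np.array([12, 15, 14, 11, 9, 10, 7, 6, 5, 4])   # hypothetical fault inflow
cum = np.cumsum(weekly_faults)

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(goel_okumoto, weeks, cum, p0=(cum[-1] * 1.5, 0.1))
lin = np.polyfit(weeks, cum, 1)                                  # simple linear alternative

t_next = 11
print("GO prediction, week 11 inflow:", goel_okumoto(t_next, a, b) - goel_okumoto(10, a, b))
print("Linear prediction, week 11 inflow:", np.polyval(lin, t_next) - np.polyval(lin, 10))
```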

Relevance:

30.00%

Publisher:

Abstract:

Large scientific applications are usually developed, tested and used by a group of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled by using collaborative working environments. Various tools exist for creating collaborative working environments, and some currently available software frameworks use them to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications further effort is needed to prepare a framework that offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application that is under continuous development and experimentation by different institutes across Europe. This paper sets out to design a collaborative distributed computing environment for UNI-DEM in particular, although the proposed framework may also fit many other large scientific applications.

Relevance:

30.00%

Publisher:

Abstract:

Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor adjustment and predictive characteristics in cases in which the historical database considered contains data coming from projects of a heterogeneous nature. The segmentation of the input domain according to clusters obtained from the database of historical projects serves as a tool for building more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without previous consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study, using the ISBSG-8 database and the EM clustering algorithm, that examines the influence of two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that such consideration significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
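
The segmentation idea can be sketched with synthetic data (the real study used ISBSG-8): a Gaussian mixture fitted by EM clusters the projects, with a process attribute included as a clustering driver, and a local log-log effort model is then fitted per segment. The attribute names and coefficients below are invented.

```python
# Cluster projects with an EM-fitted Gaussian mixture, then fit one local
# log-log effort model per segment (synthetic data, illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n = 300
size = rng.lognormal(mean=5.0, sigma=1.0, size=n)           # functional size (FP)
uses_case_tools = rng.integers(0, 2, size=n)                 # process attribute (0/1)
effort = np.exp(1.0 + (1.05 - 0.15 * uses_case_tools) * np.log(size)
                + rng.normal(0, 0.3, size=n))                # synthetic effort (hours)

X = np.column_stack([np.log(size), uses_case_tools])         # clustering drivers
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(X)

for k in range(2):
    m = labels == k
    slope, intercept = np.polyfit(np.log(size[m]), np.log(effort[m]), 1)
    print(f"segment {k}: effort ~ exp({intercept:.2f}) * size^{slope:.2f}  (n={m.sum()})")
```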

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new neurofuzzy model construction and parameter estimation algorithm from observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one to one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace, to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high dimensional regression problems, whereby it is computationally desirable to decompose complex models into a few submodels rather than a single model with large number of input variables and the associated curse of dimensionality problem. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
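
A toy sketch of the rule-construction idea follows; it uses ordinary least squares in place of the paper's extended Gram-Schmidt decomposition, and all data and membership functions are invented.

```python
# Rank T-S fuzzy rules by an A-optimality measure of their membership-weighted
# regression matrices, keep the most identifiable rules, then estimate consequent
# parameters by least squares (toy data, illustrative only).
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)      # unknown system to model

centres, width = np.linspace(-3, 3, 7), 1.0             # one Gaussian membership per rule
memberships = np.exp(-0.5 * ((x[:, None] - centres[None, :]) / width) ** 2)
memberships /= memberships.sum(axis=1, keepdims=True)   # normalised firing strengths

X = np.column_stack([np.ones_like(x), x])               # affine T-S consequents [1, x]

# A-optimality measure per rule: trace of the inverse moment matrix of the
# membership-weighted regression matrix (smaller = more identifiable rule).
a_opt = []
for i in range(centres.size):
    Pi = memberships[:, i:i + 1] * X
    a_opt.append(np.trace(np.linalg.inv(Pi.T @ Pi)))
selected = np.argsort(a_opt)[:5]                         # keep the most identifiable rules

# Global least-squares estimate of the selected rules' consequent parameters.
Phi = np.hstack([memberships[:, i:i + 1] * X for i in selected])
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("selected rule centres:", centres[selected])
print("RMS error:", np.sqrt(np.mean((Phi @ theta - y) ** 2)))
```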

Relevance:

30.00%

Publisher:

Abstract:

New construction algorithms for radial basis function (RBF) network modelling are introduced based on the A-optimality and D-optimality experimental design criteria respectively. We utilize new cost functions, based on experimental design criteria, for model selection that simultaneously optimizes model approximation, parameter variance (A-optimality) or model robustness (D-optimality). The proposed approaches are based on the forward orthogonal least-squares (OLS) algorithm, such that the new A-optimality- and D-optimality-based cost functions are constructed on the basis of an orthogonalization process that gains computational advantages and hence maintains the inherent computational efficiency associated with the conventional forward OLS approach. The proposed approach enhances the very popular forward OLS-algorithm-based RBF model construction method since the resultant RBF models are constructed in a manner that the system dynamics approximation capability, model adequacy and robustness are optimized simultaneously. The numerical examples provided show significant improvement based on the D-optimality design criterion, demonstrating that there is significant room for improvement in modelling via the popular RBF neural network.
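
The selection idea can be sketched with a simplified greedy refit in place of the orthogonalised forward OLS recursion; the data, basis widths and the weighting of the D-optimality term below are all invented.

```python
# Forward selection of RBF centres scored by training-error reduction combined with
# a D-optimality term (log-determinant of the design moment matrix), so approximation
# and robustness are traded off together (toy data; greedy least-squares refit used
# here instead of the forward OLS orthogonalisation).
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 120)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

def rbf(x, c, width=0.15):
    return np.exp(-((x - c) ** 2) / (2 * width ** 2))

candidates = list(np.linspace(0, 1, 24))   # candidate centres = grid over the input
selected, beta = [], 1e-2                  # beta weights the D-optimality term

for _ in range(6):                         # grow the model to 6 centres
    best = None
    for c in candidates:
        P = np.column_stack([rbf(x, ci) for ci in selected + [c]])
        w, *_ = np.linalg.lstsq(P, y, rcond=None)
        sse = np.sum((y - P @ w) ** 2)
        logdet = np.linalg.slogdet(P.T @ P)[1]          # D-optimality of the design
        score = sse - beta * logdet                     # smaller is better
        if best is None or score < best[0]:
            best = (score, c)
    selected.append(best[1])
    candidates.remove(best[1])

P = np.column_stack([rbf(x, c) for c in selected])
w, *_ = np.linalg.lstsq(P, y, rcond=None)
print("selected centres:", np.round(selected, 2))
print("training RMSE:", np.sqrt(np.mean((y - P @ w) ** 2)))
```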