975 results for parametric duration models


Relevance: 40.00%

Abstract:

Most parametric software cost estimation models in use today evolved in the late 1970s and early 1980s, when the dominant software development techniques were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). Because current cost estimating methods do not take account of these developments, they cannot provide adequate estimates of effort and hence cost. To address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which indicates the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
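The productivity calibration step described for JSD-FPA can be sketched in a few lines. This is a minimal illustration of the general shape of the calculation, not the published method; the size value and past-project figures are invented for the example:

```python
def estimate_effort(size_metric, past_sizes, past_efforts):
    """Turn a specification-derived size metric into an effort estimate
    by scaling with the average productivity observed on past projects."""
    # Productivity = size delivered per unit of effort, pooled over past projects.
    productivity = sum(past_sizes) / sum(past_efforts)
    return size_metric / productivity

# Hypothetical example: a JSD specification scored at 480 size units,
# calibrated against three past projects (sizes and efforts invented).
effort = estimate_effort(480.0, past_sizes=[300.0, 450.0, 600.0],
                         past_efforts=[25.0, 40.0, 55.0])
```

The same shape of calculation underlies most top-down estimating methods: size is measured from the specification, and effort follows from historically observed productivity.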

Relevance: 40.00%

Abstract:

We propose a mechanism for testing the theory of collapse models, such as continuous spontaneous localization (CSL), by examining the parametric heating rate of a trapped nanosphere. The random localizations of the centre of mass of a given particle predicted by the CSL model can be understood as a stochastic force, embodying a source of heating for the nanosphere. We show that by using a Paul trap to levitate the particle, together with optical cooling, it is possible to reduce environmental decoherence to such a level that CSL dominates the dynamics and contributes the main source of heating. We show that this approach allows measurements to be made on a timescale of seconds, and that the free parameter λ_CSL which characterises the model ought to be testable down to values as low as 10^{−12} Hz.

Relevance: 40.00%

Abstract:

Radiocarbon dating and Bayesian chronological modelling, undertaken as part of the investigation by the Times of Their Lives project into the development of Late Neolithic settlement and pottery in Orkney, have provided precise new dating for the Grooved Ware settlement of Barnhouse, excavated in 1985–91. Previous understandings of the site and its pottery are presented. A Bayesian model based on 70 measurements on 62 samples (of which 50 samples are thought to date accurately the deposits from which they were recovered) suggests that the settlement probably began in the later 32nd century cal bc (with Houses 2, 9, 3 and perhaps 5a), possibly as a planned foundation. Structure 8 – a large, monumental structure that differs in character from the houses – was probably built just after the turn of the millennium. Varied house durations and replacements are estimated. House 2 went out of use before the end of the settlement, and Structure 8 was probably the last element to be abandoned, probably during the earlier 29th century cal bc. The Grooved Ware pottery from the site is characterised by small, medium-sized, and large vessels with incised and impressed decoration, including a distinctive, false-relief, wavy-line cordon motif. A considerable degree of consistency is apparent in many aspects of ceramic design and manufacture over the use-life of the settlement, the principal change being the appearance, from c. 3025–2975 cal bc, of large coarse ware vessels with uneven surfaces and thick applied cordons, and of the use of applied dimpled circular pellets. The circumstances of the new foundation of settlement in the western part of Mainland are discussed, as well as the maintenance and character of the site. The pottery from the site is among the earliest Grooved Ware so far dated.
Its wider connections are noted, as well as the significant implications for our understanding of the timing and circumstances of the emergence of Grooved Ware, and the role of material culture in social strategies.

Relevance: 40.00%

Abstract:

The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low-frequency, high-consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, to flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuary, and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is further estimated by the parametric joint distribution model, which is a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may alter due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and stream flow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with the MWWTP planning horizons. The spatial extent of flood risk is visualized by inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help stakeholders of this critical infrastructure become aware of the flood risk, its vulnerability, and the inherent uncertainty.
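The EAD calculation from a bivariate stage-duration model can be sketched by Monte Carlo integration of a damage function over the joint distribution. The distribution below is a hypothetical correlated lognormal with invented parameters, not the model fitted in the study; the step damage function follows the kind described above:

```python
import math
import random

def expected_annual_damage(damage, n=100_000, rho=0.6, seed=42):
    """Monte Carlo estimate of expected annual damage under a hypothetical
    bivariate lognormal model of flood stage (m) and duration (h)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Correlated standard normals, then lognormal stage and duration.
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        stage = math.exp(0.5 + 0.4 * z1)      # illustrative parameters
        duration = math.exp(2.0 + 0.6 * z2)   # illustrative parameters
        total += damage(stage, duration)
    return total / n

# Step damage model: equipment is lost only if stage exceeds 2 m for over 12 h.
ead = expected_annual_damage(lambda s, d: 50_000.0 if s > 2.0 and d > 12.0 else 0.0)
```

The step function makes the role of the second dimension visible: a high but brief flood contributes no damage, which a stage-only (univariate) frequency analysis cannot represent.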

Relevance: 40.00%

Abstract:

Historic vaulted masonry structures often need strengthening interventions that can effectively improve their structural performance, especially during seismic events, while at the same time respecting the existing setting and modern conservation requirements. In this context, the use of innovative materials such as fiber-reinforced composites has been shown to be an effective solution that can satisfy both requirements. This work aims to provide insight into the computational modeling of a full-scale masonry vault strengthened with fiber-reinforced composite materials, and to analyze the influence of the arrangement of the reinforcement on the efficiency of the intervention. First, a parametric model of a cross vault, focusing on a realistic representation of its micro-geometry, is proposed. Then numerical pushover analyses of several barrel vaults reinforced with different reinforcement configurations are performed. Finally, the results are collected and discussed in terms of the force-displacement curves obtained for each proposed configuration.

Relevance: 30.00%

Abstract:

Often in biomedical research we deal with continuous (clustered) proportion responses, ranging between zero and one, that quantify the disease status of the cluster units. Interestingly, the study population might also include relatively disease-free as well as highly diseased subjects, contributing proportion values in the closed interval [0, 1]. Regression on a variety of parametric densities with support in (0, 1), such as beta regression, can assess important covariate effects, but these densities are inappropriate in the presence of zeros and/or ones. To address this, we introduce a class of general proportion densities, and further augment the probabilities of zero and one to this density while controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and an application to a real dataset from a clinical periodontology study.
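A zero-one augmented density of this kind can be written down directly. The sketch below assumes a beta density on (0, 1) augmented with point masses p0 at zero and p1 at one; the parameter names are ours, and the clustering and covariate structure of the paper are omitted:

```python
import math

def zoab_loglik(y, p0, p1, a, b):
    """Log-likelihood of one observation y in [0, 1] under a zero-one
    augmented beta density: point masses p0 at 0 and p1 at 1, and a
    beta(a, b) density on (0, 1) carrying the remaining mass."""
    if y == 0.0:
        return math.log(p0)
    if y == 1.0:
        return math.log(p1)
    # log B(a, b) via log-gamma for numerical stability.
    log_beta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return (math.log(1.0 - p0 - p1)
            + (a - 1.0) * math.log(y) + (b - 1.0) * math.log(1.0 - y)
            - log_beta)
```

Summing this over observations (with p0, p1, a, b tied to covariates via link functions) gives the likelihood that a Bayesian sampler would target.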

Relevance: 30.00%

Abstract:

An analysis of the relations between the predictions of Non-linear Phonology models about syllable weight (quantity), especially Hayes' (1995) parametric metrical phonology, and syllable duration at the phonetic level. The data considered here are extracted from the Gramática do Português Falado project.

Relevance: 30.00%

Abstract:

The zero-inflated negative binomial model is used to account for overdispersion detected in data that are initially analyzed under the zero-inflated Poisson model. A frequentist analysis, a jackknife estimator, and a non-parametric bootstrap for parameter estimation of zero-inflated negative binomial regression models are considered. In addition, an EM-type algorithm is developed for performing maximum likelihood estimation. Then the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and some ways to perform global influence analysis, are derived. In order to study departures from the error assumption as well as the presence of outliers, residual analysis based on the standardized Pearson residuals is discussed. The relevance of the approach is illustrated with a real data set, where it is shown that zero-inflated negative binomial regression models seem to fit the data better than the Poisson counterpart. (C) 2010 Elsevier B.V. All rights reserved.
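The zero-inflated negative binomial probability mass function at the core of such models can be sketched as follows. The mean/size parameterisation is one common choice (not necessarily the paper's), and the NB term is computed on the log scale to avoid overflow for large counts:

```python
import math

def zinb_pmf(y, pi, mu, k):
    """P(Y = y) under a zero-inflated negative binomial model: a structural
    zero with probability pi, otherwise NB with mean mu and size k."""
    # Negative binomial pmf on the log scale (mean/size parameterisation).
    log_nb = (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
              + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))
    nb = math.exp(log_nb)
    # Zeros arise from both the structural-zero and the count components.
    return pi + (1.0 - pi) * nb if y == 0 else (1.0 - pi) * nb
```

The excess mass at zero relative to the plain NB component is exactly what the EM-type algorithm in the paper exploits, by treating the zero's origin (structural versus count) as missing data.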

Relevance: 30.00%

Abstract:

Leaf wetness duration (LWD) is a key parameter in agricultural meteorology, since it is related to the epidemiology of many important crops, controlling pathogen infection and development rates. Because LWD is not widely measured, several methods have been developed to estimate it from weather data. Among these, models that use physical principles of dew formation and dew and/or rain evaporation have shown good portability and sufficiently accurate results, but their complexity is a disadvantage for operational use. Alternatively, empirical models have been used despite their limitations; the simplest use only relative humidity data. The objective of this study was to evaluate the performance of three RH-based empirical models for estimating LWD in four regions around the world with different climate conditions. Hourly LWD, air temperature, and relative humidity data were obtained from Ames, IA (USA), Elora, Ontario (Canada), Florence, Tuscany (Italy), and Piracicaba, São Paulo State (Brazil). These data were used to evaluate the following empirical LWD estimation models: constant RH threshold (RH >= 90%); dew point depression (DPD); and extended RH threshold (EXT_RH). The models performed differently in the four locations. In Ames, Elora, and Piracicaba, the RH >= 90% and DPD models underestimated LWD, whereas in Florence these methods overestimated LWD, especially for shorter wet periods. When the EXT_RH model was used, LWD was overestimated for all locations, with a significant increase in the errors. In general, the RH >= 90% model performed best, presenting the highest overall fraction of correct estimates (FC), between 0.87 and 0.92, and the lowest false alarm ratio (FAR), between 0.02 and 0.31.
The use of specific thresholds for each location improved the accuracy of the RH model substantially, even when independent data were used; the MAE ranged from 1.23 to 1.89 h, which is very similar to the errors obtained with published physical models for LWD estimation. Based on these results, we concluded that, if calibrated locally, LWD can be estimated with acceptable accuracy from RH above a specific threshold, and that the EXT_RH method was unsuitable for estimating LWD at the locations used in this study. (C) 2007 Elsevier B.V. All rights reserved.
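The constant-threshold model that performed best is simple enough to state in a few lines: count the hours at or above the RH threshold. The hourly RH trace below is invented for illustration:

```python
def lwd_rh_threshold(hourly_rh, threshold=90.0):
    """Estimate leaf wetness duration (h) as the number of hours with
    relative humidity at or above the threshold (locally calibrated)."""
    return sum(1 for rh in hourly_rh if rh >= threshold)

# A hypothetical 24-hour RH trace (%): humid night, dry afternoon.
rh = [96, 97, 98, 98, 99, 97, 95, 92, 88, 80, 72, 65,
      60, 58, 57, 60, 66, 74, 82, 88, 91, 93, 95, 96]
wet_hours = lwd_rh_threshold(rh)  # hours with RH >= 90%
```

Local calibration, as the study concludes, amounts to nothing more than replacing the default 90% with a site-specific threshold.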

Relevance: 30.00%

Abstract:

Leaf wetness duration (LWD) is related to plant disease occurrence and is therefore a key parameter in agrometeorology. As LWD is seldom measured at standard weather stations, it must be estimated in order to ensure the effectiveness of warning systems and the scheduling of chemical disease control. Among the models used to estimate LWD, those that use physical principles of dew formation and dew and/or rain evaporation have shown good portability and sufficiently accurate results for operational use. However, the requirement for net radiation (Rn) is a disadvantage for operational physical models, since this variable is usually not measured over crops or even at standard weather stations. With the objective of proposing a solution to this problem, this study evaluated the ability of four models to estimate hourly Rn and their impact on LWD estimates obtained with a Penman-Monteith approach. A field experiment was carried out in Elora, Ontario, Canada, with measurements of LWD, Rn, and other meteorological variables over mowed turfgrass for a 58-day period during the growing season of 2003. Four models for estimating hourly Rn, based on different combinations of incoming solar radiation (Rg), air temperature (T), relative humidity (RH), cloud cover (CC), and cloud height (CH), were evaluated. Measured and estimated hourly Rn values were applied in a Penman-Monteith model to estimate LWD. Correlating measured and estimated Rn, we observed that all models performed well in terms of estimating hourly Rn. However, when cloud data were used the models overestimated positive Rn and underestimated negative Rn. When only Rg and T were used to estimate hourly Rn, the model underestimated positive Rn and no tendency was observed for negative Rn. The best performance was obtained with Model I, which presented, in general, the smallest mean absolute error (MAE) and the highest C-index.
When measured LWD was compared to the Penman-Monteith LWD, calculated with measured and estimated Rn, few differences were observed. Both precision and accuracy were high, with the slopes of the relationships ranging from 0.96 to 1.02 and R-2 from 0.85 to 0.92, resulting in C-indices between 0.87 and 0.93. The LWD mean absolute errors associated with the Rn estimates were between 1.0 and 1.5 h, which is sufficient for use in plant disease management schemes.
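A much-simplified version of the wetness criterion in a Penman-Monteith approach can be sketched as follows. This is a rough illustration, not the model used in the study: surface resistance is taken as zero (wet canopy), the aerodynamic resistance and air heat capacity are fixed illustrative constants, and an hour is flagged wet when the predicted latent heat flux is negative (condensation rather than evaporation):

```python
import math

def svp(t_c):
    """Saturation vapour pressure (kPa) at air temperature t_c (deg C), Tetens formula."""
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def latent_heat_flux(rn, t_c, rh, ra=50.0, gamma=0.066):
    """Simplified Penman-Monteith latent heat flux (W m-2) for a wet surface.
    rn: net radiation (W m-2); rh: relative humidity (%);
    ra: aerodynamic resistance (s m-1); gamma: psychrometric constant (kPa K-1).
    Negative values indicate condensation (dew deposition)."""
    delta = 4098.0 * svp(t_c) / (t_c + 237.3) ** 2  # slope of SVP curve (kPa K-1)
    vpd = svp(t_c) * (1.0 - rh / 100.0)             # vapour pressure deficit (kPa)
    rho_cp = 1208.0                                  # volumetric heat capacity of air (J m-3 K-1)
    return (delta * rn + rho_cp * vpd / ra) / (delta + gamma)

# A humid, radiatively cooling night hour: the flux is negative, so this
# hour would be counted toward leaf wetness duration.
wet = latent_heat_flux(rn=-40.0, t_c=12.0, rh=99.0) < 0.0
```

The sketch makes the study's point concrete: Rn enters the flux linearly through the delta * rn term, so errors in estimated hourly Rn propagate directly into the wet/dry decision for each hour.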

Relevance: 30.00%

Abstract:

Objective: To investigate the effects of bilateral, surgically induced functional inhibition of the subthalamic nucleus (STN) on general language, high level linguistic abilities, and semantic processing skills in a group of patients with Parkinson’s disease. Methods: Comprehensive linguistic profiles were obtained up to one month before and three months after bilateral implantation of electrodes in the STN during active deep brain stimulation (DBS) in five subjects with Parkinson’s disease (mean age, 63.2 years). Equivalent linguistic profiles were generated over a three month period for a non-surgical control cohort of 16 subjects with Parkinson’s disease (NSPD) (mean age, 64.4 years). Education and disease duration were similar in the two groups. Initial assessment and three month follow up performance profiles were compared within subjects by paired t tests. Reliability change indices (RCI), representing clinically significant alterations in performance over time, were calculated for each of the assessment scores achieved by the five STN-DBS cases and the 16 NSPD controls, relative to performance variability within a group of 16 non-neurologically impaired adults (mean age, 61.9 years). Proportions of reliable change were then compared between the STN-DBS and NSPD groups. Results: Paired comparisons within the STN-DBS group showed prolonged postoperative semantic processing reaction times for a range of word types coded for meanings and meaning relatedness. Case by case analyses of reliable change across language assessments and groups revealed differences in proportions of change over time within the STN-DBS and NSPD groups in the domains of high level linguistics and semantic processing. Specifically, when compared with the NSPD group, the STN-DBS group showed a proportionally significant (p

Relevance: 30.00%

Abstract:

The truncation errors associated with finite difference solutions of the advection-dispersion equation with first-order reaction are formulated from a Taylor analysis. The error expressions are based on a general form of the corresponding difference equation, and a temporally and spatially weighted parametric approach is used to differentiate among the various finite difference schemes. The numerical truncation errors are defined using Peclet and Courant numbers and a new sink/source dimensionless number. It is shown that all of the finite difference schemes suffer from truncation errors. In particular, it is shown that the Crank-Nicolson approximation scheme does not have second-order accuracy for this case. The effects of these truncation errors on the solution of an advection-dispersion equation with a first-order reaction term are demonstrated by comparison with an analytical solution. The results show that these errors are not negligible and that correcting the finite difference scheme for them results in a more accurate solution. (C) 1999 Elsevier Science B.V. All rights reserved.
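The dimensionless numbers in which such error expressions are written can be computed directly from the discretisation. The sink/source number below is taken as the reaction rate times the time step, which is one common choice; the paper's exact definition is not reproduced here:

```python
def grid_numbers(v, D, k, dx, dt):
    """Dimensionless numbers governing truncation error of a finite
    difference advection-dispersion-reaction scheme.
    v: velocity, D: dispersion coefficient, k: first-order reaction rate,
    dx: grid spacing, dt: time step (any consistent units)."""
    peclet = v * dx / D        # grid Peclet number: advection vs dispersion
    courant = v * dt / dx      # Courant number: advection vs grid traversal
    sink_source = k * dt       # reaction number (one common definition)
    return peclet, courant, sink_source

pe, cr, ss = grid_numbers(v=1.0, D=0.5, k=0.05, dx=0.1, dt=0.05)
```

Keeping the grid Peclet number small (commonly below 2) and the Courant number at or below 1 is the usual starting point for controlling the error terms the paper analyses.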

Relevance: 30.00%

Abstract:

The periodic structure of business cycles suggests that significant asymmetries are present over different phases of the cycle. This paper uses Markov regime-switching models with fixed and duration-dependent transition probabilities to directly model expansions, contractions, and durations in Australian GDP growth and unemployment growth. Evidence is found of significant asymmetry in growth rates across expansions and contractions for both series. GDP contractions exhibit duration dependence, implying that as output recessions age the likelihood of switching into an expansion phase increases. Unemployment growth does not exhibit duration dependence in either phase. Evidence is also presented that non-linearities in unemployment growth are well explained by the asymmetries in the GDP growth cycle. The analysis suggests that recessions are periods of rapid and intense job destruction, that Australian unemployment tends to ratchet up in recessionary periods and, in contrast to US and UK studies, that shocks to Australian unemployment growth are more persistent in recessions than in expansions. [E37 C5 C41]
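Duration dependence in a regime-switching model means the transition probability is a function of how long the current regime has lasted. A toy two-regime simulation with a logistic duration-dependent staying probability (all coefficients invented, not estimates from the paper) looks like this:

```python
import math
import random

def p_stay(duration, beta0=2.0, beta1=-0.15):
    """Probability of remaining in the current regime, shrinking as the
    regime ages (logistic duration dependence; coefficients illustrative)."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * duration)))

def simulate(n=200, seed=1):
    """Simulate a two-regime chain (0 = expansion, 1 = contraction)."""
    rng = random.Random(seed)
    state, duration, path = 0, 1, []
    for _ in range(n):
        path.append(state)
        if rng.random() < p_stay(duration):
            duration += 1           # regime survives, ages by one period
        else:
            state, duration = 1 - state, 1  # switch and reset the clock
    return path

path = simulate()
```

With beta1 < 0 the exit hazard rises with regime age, which is exactly the pattern the paper reports for Australian GDP contractions; beta1 = 0 recovers the fixed-transition-probability case.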

Relevance: 30.00%

Abstract:

A number of mathematical models have been used to describe percutaneous absorption kinetics. In general, most of these models have used either diffusion-based or compartmental equations. The object of any mathematical model is to a) represent the processes associated with absorption accurately, b) describe/summarize experimental data with parametric equations or moments, and c) predict kinetics under varying conditions. However, in describing the processes involved, some models suffer from being of too complex a form to be practically useful. In this chapter, we approach the issue of mathematical modeling in percutaneous absorption from four perspectives: a) describing simple practical models, b) providing an overview of the more complex models, c) summarizing some of the more important/useful models used to date, and d) examining some practical applications of the models. The range of processes involved in percutaneous absorption and considered in developing the mathematical models in this chapter is shown in Fig. 1. We initially address in vitro skin diffusion models and consider a) constant donor concentration and receptor conditions, b) the corresponding flux, donor, skin, and receptor amount-time profiles for solutions, and c) amount- and flux-time profiles when the donor phase is removed. More complex issues, such as a finite-volume donor phase, a finite-volume receptor phase, the presence of an efflux rate constant at the membrane-receptor interphase, and two-layer diffusion, are then considered. We then look at specific models and issues concerned with a) release from topical products, b) the use of compartmental models as alternatives to diffusion models, c) concentration-dependent absorption, d) modeling of skin metabolism, e) the role of solute-skin-vehicle interactions, f) the effects of vehicle loss, g) shunt transport, and h) in vivo diffusion, compartmental, physiological, and deconvolution models.
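For the simplest in vitro case (constant donor concentration, sink receptor), two standard membrane-diffusion quantities summarise the kinetics: the steady-state flux and the lag time of the cumulative-amount profile. A minimal sketch with generic symbols, not tied to any one model in the chapter:

```python
def steady_state_flux(K, D, h, c_v):
    """Steady-state flux through a skin membrane (amount per area per time):
    J = K * D * c_v / h, with vehicle-membrane partition coefficient K,
    diffusivity D, membrane thickness h, and donor concentration c_v."""
    return K * D * c_v / h

def lag_time(h, D):
    """Diffusion lag time of the cumulative-amount profile: t_lag = h**2 / (6 * D).
    Extrapolating the linear portion of the profile back to zero amount
    intercepts the time axis at t_lag."""
    return h ** 2 / (6.0 * D)
```

These two quantities are what simple practical models typically report, and fitting them to experimental amount-time data yields K*D/h and D/h**2 without requiring the full diffusion solution.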
We conclude by examining topics such as a) deep tissue penetration, b) pharmacodynamics, c) iontophoresis, d) sonophoresis, and e) pitfalls in modeling.