982 results for parametric model


Relevance: 30.00%

Abstract:

In this study we present a novel automated strategy for predicting infarct evolution, based on MR diffusion and perfusion images acquired in the acute stage of stroke. The validity of this methodology was tested on novel patient data, including data acquired from an independent stroke clinic. Regions of interest (ROIs) defining the initial diffusion lesion and tissue with abnormal hemodynamic function, as defined by the mean transit time (MTT) abnormality, were automatically extracted from DWI/PI maps. Quantitative measures of cerebral blood flow (CBF) and volume (CBV), along with ratio measures defined relative to the contralateral hemisphere (r(a)CBF and r(a)CBV), were calculated for the MTT ROIs. A parametric normal classifier algorithm incorporating these measures was used to predict infarct growth. The mean r(a)CBF and r(a)CBV values for eventually infarcted MTT tissue were 0.70 ± 0.19 and 1.20 ± 0.36; for recovered tissue the mean values were 0.99 ± 0.25 and 1.87 ± 0.71, respectively. There was a significant difference between these two regions for both measures (P
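To make the classification step concrete, here is a minimal sketch of a parametric normal (Gaussian) classifier on the two ratio features, implemented with scikit-learn's quadratic discriminant analysis. The training voxels are synthetic, drawn from the class means and standard deviations reported above; they are not the study's data.

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
# synthetic training voxels: class 0 = infarcted, class 1 = recovered
infarcted = rng.normal([0.70, 1.20], [0.19, 0.36], size=(200, 2))
recovered = rng.normal([0.99, 1.87], [0.25, 0.71], size=(200, 2))
X = np.vstack([infarcted, recovered])
y = np.array([0] * 200 + [1] * 200)

clf = QuadraticDiscriminantAnalysis().fit(X, y)  # one Gaussian per class
print(clf.predict([[0.75, 1.3]]))                # likely [0]: infarcting tissue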

Relevance: 30.00%

Abstract:

In many occupational safety interventions, the objective is to reduce the injury incidence as well as the mean claims cost once injury has occurred. The claims cost data within a period typically contain a large proportion of zero observations (no claim). The distribution thus comprises a point mass at 0 mixed with a non-degenerate parametric component. Essentially, the likelihood function can be factorized into two orthogonal components, relating respectively to the effect of covariates on the incidence of claims and on the magnitude of claims, given that claims are made. Furthermore, the longitudinal nature of the intervention inherently imposes some correlation among the observations. This paper introduces a zero-augmented gamma random effects model for analysing longitudinal data with many zeros. Adopting the generalized linear mixed model (GLMM) approach reduces the original problem to the fitting of two independent GLMMs. The method is applied to evaluate the effectiveness of a workplace risk assessment teams program, trialled within the cleaning services of a Western Australian public hospital.
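Because the likelihood factorizes, the fitting reduces to two independent model fits. Below is a minimal sketch of that factorization with statsmodels, leaving out the random effects that make the paper's model a GLMM; the data and column names are hypothetical.

import pandas as pd
import statsmodels.api as sm

# toy claims records; "intervention" is a hypothetical covariate
df = pd.DataFrame({
    "cost": [0.0, 0.0, 120.5, 0.0, 340.0, 89.9, 0.0, 55.1],
    "intervention": [0, 1, 0, 1, 0, 1, 0, 1],
})
X = sm.add_constant(df["intervention"].astype(float))

# (i) incidence component: does a claim occur at all?
occurred = (df["cost"] > 0).astype(int)
incidence = sm.GLM(occurred, X, family=sm.families.Binomial()).fit()

# (ii) severity component: Gamma GLM (log link) on the positive costs only
pos = df["cost"] > 0
severity = sm.GLM(df.loc[pos, "cost"], X[pos],
                  family=sm.families.Gamma(sm.families.links.Log())).fit()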

Relevance: 30.00%

Abstract:

Motivation: This paper introduces the software EMMIX-GENE, which has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant to the clustering of the tissue samples: mixtures of t distributions are fitted to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. A threshold on the likelihood ratio statistic, used in conjunction with a threshold on cluster size, allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, so mixtures of factor analyzers are used to effectively reduce the dimension of the feature space of genes. Results: The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes can be selected that reveal interesting clusterings of the tissues, consistent either with the external classification of the tissues or with background biological knowledge of these sets.
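The gene-ranking step can be pictured with the sketch below, in which Gaussian mixtures fitted with scikit-learn stand in for the t mixtures that EMMIX-GENE actually uses, and the expression matrix is a random placeholder.

import numpy as np
from sklearn.mixture import GaussianMixture

def lr_statistic(expr):
    """-2 log-likelihood ratio for one versus two mixture components.

    expr: expression of one gene across the tissue samples, shape (n, 1).
    """
    ll1 = GaussianMixture(1).fit(expr).score(expr) * len(expr)
    ll2 = GaussianMixture(2, n_init=5).fit(expr).score(expr) * len(expr)
    return 2.0 * (ll2 - ll1)

rng = np.random.default_rng(1)
tissues = rng.normal(size=(50, 200))       # 50 tissues x 200 genes (toy data)
stats = [lr_statistic(tissues[:, g:g + 1]) for g in range(tissues.shape[1])]
ranking = np.argsort(stats)[::-1]          # most "two-component" genes first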

Relevance: 30.00%

Abstract:

A model of iron carbonate (FeCO3) film growth is proposed, which is an extension of the recent mechanistic model of carbon dioxide (CO2) corrosion by Nesic et al. In the present model, the film grows by precipitation of iron carbonate once saturation is exceeded. The kinetics of precipitation depends on temperature and on the local species concentrations, which are calculated by solving the coupled species transport equations. Precipitation tends to build up a layer of FeCO3 on the steel surface and reduce the corrosion rate. On the other hand, the corrosion process induces voids under the precipitated film, increasing its porosity and leading to a higher corrosion rate. Depending on environmental parameters such as temperature, pH, CO2 partial pressure, and velocity, the balance of the two processes can lead to a variety of outcomes. As expected, very protective films and low corrosion rates are predicted at high pH, temperature, CO2 partial pressure, and Fe2+ concentration, due to the formation of dense films. The model has been successfully calibrated against limited experimental data. Parametric testing of the model has been carried out to gain insight into the effect of various environmental parameters on iron carbonate film formation. The trends in the predictions agree well with the general understanding of the CO2 corrosion process in the presence of iron carbonate films. The present model confirms that the concept of scaling tendency is a good tool for predicting the likelihood of protective iron carbonate film formation.
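For orientation, the two driving quantities in models of this kind are commonly defined as follows (standard definitions from the CO2 corrosion literature, not equations quoted from this paper): the supersaturation S and the scaling tendency ST,

$$
S = \frac{c_{\mathrm{Fe}^{2+}}\, c_{\mathrm{CO}_3^{2-}}}{K_{sp}},
\qquad
ST = \frac{R_{\mathrm{precipitation}}}{R_{\mathrm{corrosion}}},
$$

with precipitation possible only for S > 1, and dense, protective films expected when ST is of order one or larger, i.e. when precipitation keeps pace with the corrosion-driven undermining of the film.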

Relevance: 30.00%

Abstract:

In recent years, several studies have aimed to assess the out-of-plane seismic response of unreinforced stone masonry structures. This has led to the development of a wide variety of models and approaches, ranging from simple kinematics-based analytical models up to complex numerical simulations. Nevertheless, for the sake of simplicity, the out-of-plane seismic response of a masonry wall pier may be obtained by means of a simple single-degree-of-freedom system while still providing good results. In fact, despite the assumptions associated with such a simple formulation, it is also true that the epistemic uncertainty inherent in the selection of appropriate input parameters for more complex models may render them ineffective in practice. In this framework, this paper focuses on the out-of-plane bending of unreinforced stone masonry walls (cantilevers) by proposing a simplified analytical approach based on the construction of a linearized four-branch model, which characterizes the linear and nonlinear response of such structural elements through an overturning moment-rotation relationship. The formulation of the four-branch model is presented and described in detail, and the parameters used for its construction are obtained from a set of experimental laboratory tests performed on six full-scale unreinforced regular sacco stone masonry specimens. Moreover, a parametric analysis evaluating the effect of the variation of these parameters on the final configuration of the model is presented and critically discussed. Finally, the results obtained from the application of the four-branch model to real unreinforced regular sacco stone masonry walls are thoroughly analysed and the main conclusions are summarized.
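As a purely illustrative sketch, a linearized four-branch overturning moment-rotation backbone for a rocking cantilever wall can be assembled as below. The corner points here are hypothetical; in the paper they are derived from the tests on the sacco stone masonry specimens.

import numpy as np

W, t, h = 50.0, 0.65, 2.5        # wall weight [kN], thickness, height [m] (assumed)
M_rigid = W * t / 2.0            # rigid-body overturning moment [kN m]
theta_u = t / (2.0 * h)          # rotation at static overturning (small angles)

# four linear branches: initial rise, plateau, degrading branch, residual
theta_pts = np.array([0.0, 0.1, 0.3, 0.7, 1.0]) * theta_u
M_pts = np.array([0.0, 0.9, 1.0, 0.5, 0.0]) * M_rigid

def moment(theta):
    """Piecewise-linear overturning moment M(theta) from the backbone."""
    return np.interp(theta, theta_pts, M_pts)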

Relevance: 30.00%

Abstract:

The present work describes a model for determining the moment–rotation relationship of a cross section of fiber reinforced concrete (FRC) elements that also include longitudinal bars as flexural reinforcement (R/FRC). Since a stress–crack width relationship (σ–w) is used to model the post-cracking behavior of the FRC, the σ–w law can be obtained directly from tensile tests, or derived by inverse analysis from the results of three-point notched beam bending tests. For a more realistic assessment of the crack opening, a bond stress versus slip relationship is assumed to simulate the bond between the longitudinal bars and the surrounding FRC. To simulate the compression behavior of the FRC, a shear friction model is adopted, based on the physical interpretation of the post-peak compression softening observed in experimental tests. By allowing the formation of a compressive FRC wedge delimited by shear band zones, the concept of the concrete crushing failure mode in beams failing in bending is reinterpreted. Using the moment–rotation relationship, an algorithm was developed to determine the force–deflection response of statically determinate R/FRC elements. The model is described in detail and its good predictive performance is demonstrated using available experimental data. Parametric studies were carried out to demonstrate the influence of the model's key parameters on the serviceability and ultimate design conditions of R/FRC elements failing in bending.
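A much-simplified flavour of such a sectional computation is sketched below: a layered cross section with a linear tension-softening law, with the neutral axis found by bisection on axial equilibrium. The paper's model additionally tracks the reinforcement bars, the bond-slip law and the shear-friction compression wedge, all omitted here; every material number is an assumption.

import numpy as np

b, h = 0.15, 0.30                 # section width and height [m]
fc, ft, E = 40e6, 3.5e6, 30e9     # strengths and modulus [Pa] (assumed)
eps_cr = ft / E                   # cracking strain
eps_tu = 25 * eps_cr              # end of the linear softening branch (assumed)
L_h = 0.5 * h                     # hinge length mapping curvature to rotation

def stress(eps):                  # compression positive
    if eps >= 0:
        return min(E * eps, fc)                  # elastic, capped at fc
    e = -eps
    if e <= eps_cr:
        return -E * e                            # elastic tension
    if e <= eps_tu:
        return -ft * (1 - (e - eps_cr) / (eps_tu - eps_cr))  # softening
    return 0.0                                   # fully opened crack

def moment_rotation(kappa, n=400):
    y = np.linspace(-h / 2, h / 2, n)            # layer coordinates
    dA = b * h / n
    lo, hi = -0.02, 0.02                         # bisection on axial balance
    for _ in range(60):
        eps0 = 0.5 * (lo + hi)
        N = sum(stress(eps0 + kappa * yi) * dA for yi in y)
        if N > 0:
            hi = eps0                            # too much compression
        else:
            lo = eps0
    M = sum(stress(eps0 + kappa * yi) * yi * dA for yi in y)
    return M, kappa * L_h                        # moment and hinge rotation

M, rot = moment_rotation(0.01)                   # one point of the M-rot curve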

Relevance: 30.00%

Abstract:

Existing masonry structures are usually associated with high seismic vulnerability, mainly due to the properties of the materials, weak connections between floors and load-bearing walls, the high mass of the masonry walls, and the flexibility of the floors. For these reasons, the seismic performance of existing masonry structures has received much attention in recent decades. This study presents a parametric analysis taking into account deviations in the features of gaioleiro buildings, a Portuguese building typology. The main objective of the parametric analysis is to compare the seismic performance of the structure, as a function of variations of its properties, with the response of a reference model. The parametric analysis was carried out for two types of structural analysis, namely non-linear dynamic analysis with time integration and pushover analysis with a distribution of forces proportional to the inertial forces of the structure. The Young's modulus of the masonry walls, the Young's modulus of the timber floors, and the compressive and tensile non-linear properties (strength and fracture energy) were the properties considered in both types of analysis. Additionally, in the dynamic analysis, the influence of viscous damping and of the vertical component of the earthquake was evaluated. A pushover analysis proportional to the modal displacement of the first mode in each direction was also carried out. The results show that the Young's modulus of the masonry walls, the Young's modulus of the timber floors and the compressive non-linear properties are the parameters that most influence the seismic performance of this type of tall and weak existing masonry structure. Furthermore, it is concluded that the stiffness of the floors significantly influences the strength capacity and the collapse mechanism of the numerical model. Thus, a study on the strengthening of the floors was also carried out. Increasing the thickness of the timber floors was the strengthening technique with the best seismic performance, most notably reducing the out-of-plane displacements of the masonry walls.
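Structurally, such a study is a one-at-a-time parameter sweep compared against a reference model. A generic scaffold of that loop is sketched below; the property names, values and the dummy analysis function are placeholders, not the paper's model.

import itertools

# reference model properties; names and values are placeholders
reference = {"E_masonry": 1.0e9, "E_floor": 12.0e9, "fc": 1.0e6, "ft": 0.1e6}
factors = [0.5, 1.0, 2.0]            # scale each property down and up

def run_analysis(props):
    # dummy stand-in: replace with the nonlinear dynamic or pushover
    # analysis of the building model, returning e.g. peak base shear
    return props["E_masonry"] ** 0.5

baseline = run_analysis(reference)
results = {}
for prop, k in itertools.product(reference, factors):
    perturbed = dict(reference, **{prop: reference[prop] * k})
    results[(prop, k)] = run_analysis(perturbed) / baseline  # vs. reference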

Relevance: 30.00%

Abstract:

A newly developed strain-rate-dependent anisotropic continuum model is proposed for impact and blast applications in masonry. The model adopts the usual approach of considering different yield criteria in tension and compression. An analysis of unreinforced blockwork masonry walls subjected to impact is carried out to validate the capability of the model; comparison of the numerical predictions with test data reveals good agreement. Next, a parametric study is conducted to evaluate the influence of the tensile strengths along the three orthogonal directions, and of the wall thickness, on the global behavior of masonry walls.
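Strain-rate dependence in models of this kind typically enters as a dynamic increase factor (DIF) that scales the static strength with strain rate. The power-law form and the constants below are illustrative assumptions, not the paper's calibration.

def dynamic_strength(f_static, strain_rate, ref_rate=1e-5, alpha=0.05):
    """Rate-enhanced strength via a power-law dynamic increase factor."""
    return f_static * max(1.0, (strain_rate / ref_rate) ** alpha)

ft_impact = dynamic_strength(0.3e6, 10.0)   # tensile strength at 10/s [Pa]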

Relevance: 30.00%

Abstract:

The present study proposes a dynamic constitutive material interface model that includes a non-associated flow rule and high strain rate effects, implemented in the finite element code ABAQUS as a user subroutine. First, the capability of the model is validated through numerical simulations of unreinforced blockwork masonry walls subjected to low-velocity impact; the results are compared with field test data and good agreement is found. Subsequently, a comprehensive parametric analysis is carried out with different joint tensile strengths, cohesion values, and wall thicknesses to evaluate the effect of these parameter variations on the impact response of masonry walls.
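The essential ingredient of a non-associated interface law can be sketched as follows: yielding is checked with the friction angle, while the direction of plastic slip uses a smaller dilatancy angle, so sliding produces less joint opening than an associated rule would predict. All parameter values below are assumptions.

import math

# friction angle, dilatancy angle (psi < phi), cohesion [Pa]; all assumed
phi, psi, c = math.radians(35.0), math.radians(5.0), 0.35e6

def interface_check(sigma_n, tau):
    """Yield check for a cohesive-frictional interface, tension positive.

    Returns (yielding, flow direction); the flow direction uses the
    dilatancy angle psi rather than phi, which is the non-associated part.
    """
    f = abs(tau) - c + sigma_n * math.tan(phi)   # Coulomb yield function
    if f <= 0.0:
        return False, (0.0, 0.0)
    d_slip = math.copysign(1.0, tau)             # tangential slip direction
    d_open = math.tan(psi)                       # normal opening per unit slip
    return True, (d_slip, d_open)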

Relevance: 30.00%

Abstract:

Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study.

Location: World-wide.

Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing the generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny, to account for dating uncertainty in the biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years.

Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information; and (2) the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimates influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node.

Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that lie outside the extant species ranges, owing to the dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.
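The stratified palaeogeographical model can be pictured as a set of dispersal-rate multipliers that change across time slices. The sketch below shows the idea with two slices and hypothetical areas and values; the study itself used four slices spanning the last 110 million years.

import numpy as np

areas = ["Am", "Af", "Eu", "Au"]     # hypothetical area set
base_rate = 0.01                     # baseline dispersal rate (assumed)

# connectivity multipliers per time slice (Ma); values are hypothetical:
# 1.0 = well connected, 0.1 = long-distance dispersal only
slices = {
    (110, 70): np.array([[1.0, 1.0, 0.5, 0.1],
                         [1.0, 1.0, 0.5, 0.1],
                         [0.5, 0.5, 1.0, 0.1],
                         [0.1, 0.1, 0.1, 1.0]]),
    (70, 0):   np.array([[1.0, 0.5, 1.0, 0.1],
                         [0.5, 1.0, 0.5, 0.1],
                         [1.0, 0.5, 1.0, 0.5],
                         [0.1, 0.1, 0.5, 1.0]]),
}

def dispersal_rate(i, j, age_ma):
    """Dispersal rate from areas[i] to areas[j] at a given age (Ma)."""
    for (older, younger), conn in slices.items():
        if younger <= age_ma < older:
            return base_rate * conn[i, j]
    raise ValueError("age outside the modelled 110-0 Ma window")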

Relevance: 30.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y) \in \mathcal{R}^d \times \mathcal{R}$, we wish to select from a fixed sequence of model classes $\mathcal{F}_1, \mathcal{F}_2, \ldots$ a deterministic prediction rule $f: \mathcal{R}^d \to \mathcal{R}$ whose risk is small. We investigate the possibility of empirically assessing the complexity of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure based on complexity-penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates so as to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite-sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
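A schematic of the selection procedure, with polynomial regression classes standing in for the abstract model classes and a crude bootstrap "cover" and toy penalty in place of the paper's covering construction:

import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, 200)
Y = np.sin(3 * X) + 0.1 * rng.normal(size=200)
X1, Y1, X2, Y2 = X[:100], Y[:100], X[100:], Y[100:]   # the two data halves

def risk(coefs, x, y):
    return np.mean((np.polyval(coefs, x) - y) ** 2)   # empirical squared loss

selected = []
for degree in range(1, 8):             # model classes F_1, ..., F_7
    cover = []                         # crude "empirical cover" of the class:
    for _ in range(5 * degree):        # fits on bootstrap resamples of half 1
        idx = rng.integers(0, 100, 100)
        cover.append(np.polyfit(X1[idx], Y1[idx], degree))
    best = min(cover, key=lambda c: risk(c, X2, Y2))  # candidate from cover
    penalty = np.log(len(cover)) / len(X2)            # toy complexity penalty
    selected.append((risk(best, X2, Y2) + penalty, degree, best))

_, chosen_degree, chosen_rule = min(selected, key=lambda s: s[0])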

Relevance: 30.00%

Abstract:

Computed tomography (CT) is the standard imaging modality for tumor volume delineation in radiotherapy treatment planning of retinoblastoma, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but it has low sensitivity for assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is not currently integrated into treatment planning, being used only for diagnosis and follow-up. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in 3D US, the segmentation of the organs at risk (OAR) in CT, and the registration of both modalities. In this paper, we present some preliminary results in this direction. We present 3D active-contour-based segmentation of the eyeball and the lens in CT images; the approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. We then present two approaches for the fusion of 3D CT and US images: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eyeball contour information in the CT and US images.
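The landmark-based fusion step amounts to a least-squares rigid transform (the classical Kabsch/Procrustes solution) mapping landmark points picked in the US volume onto their CT counterparts. A self-contained sketch, with made-up landmark coordinates:

import numpy as np

def rigid_landmark_transform(P, Q):
    """Rotation R and translation t minimizing sum ||R @ P_i + t - Q_i||^2."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # no reflections
    R = (U @ D @ Vt).T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

us_pts = np.array([[10.0, 4.0, 2.0], [12.0, 8.0, 3.0],
                   [ 9.0, 6.0, 7.0], [14.0, 5.0, 5.0]])
# synthetic CT landmarks: the US points rotated 90 deg and shifted by 5 mm
ct_pts = us_pts @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]]).T + 5.0
R, t = rigid_landmark_transform(us_pts, ct_pts)   # recovers that mapping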

Relevance: 30.00%

Abstract:

Within the scope of the European project Hydroptimet (INTERREG IIIB-MEDOCC programme), a limited-area model (LAM) intercomparison is performed for intense events that caused heavy damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors for producing good forecasts of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event known as the "Montserrat-2000" event. The study uses forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift in the forecast error and to identify the error sources that affected each model's forecasts. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
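The non-parametric skill scores mentioned above are built from the 2x2 contingency table of forecast versus observed threshold exceedances. A minimal implementation of three common ones (probability of detection, false alarm ratio, equitable threat score); the rain amounts in the example are made up:

import numpy as np

def skill_scores(fcst, obs, threshold):
    """POD, FAR and ETS from the 2x2 contingency table of exceedances."""
    f = np.asarray(fcst) >= threshold
    o = np.asarray(obs) >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    n = f.size
    hits_random = (hits + false_alarms) * (hits + misses) / n  # chance hits
    pod = hits / (hits + misses)                   # probability of detection
    far = false_alarms / (hits + false_alarms)     # false alarm ratio
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return pod, far, ets

# e.g. 24-h rain amounts [mm] at four gauges vs. the collocated forecasts
pod, far, ets = skill_scores([0, 12, 30, 2], [0, 8, 45, 20], threshold=10.0)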

Relevance: 30.00%

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods while reducing the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far, and to the targeting expert's variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
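For intuition, an optical-flow registration of the kind this framework builds on can be reduced to a demons-like iteration: push the displacement field along the intensity mismatch, then smooth it. This 2-D toy sketch is not the paper's algorithm (which couples the flow with an active-contour segmentation term):

import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def demons_step(fixed, moving, u, v, sigma=2.0):
    """One update of the 2-D displacement field (u, v), then smoothing."""
    yy, xx = np.mgrid[0:fixed.shape[0], 0:fixed.shape[1]].astype(float)
    warped = map_coordinates(moving, [yy + v, xx + u], order=1)
    gy, gx = np.gradient(fixed)
    diff = warped - fixed
    denom = gx ** 2 + gy ** 2 + diff ** 2 + 1e-9   # demons normalization
    u = gaussian_filter(u - diff * gx / denom, sigma)
    v = gaussian_filter(v - diff * gy / denom, sigma)
    return u, v

# usage: u = v = np.zeros(atlas.shape); iterate demons_step(patient, atlas, u, v)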

Relevance: 30.00%

Abstract:

A parametric procedure for the blind inversion of nonlinear channels is proposed, based on a recent method of blind source separation in nonlinear mixtures. Experiments show that the proposed algorithms perform efficiently, even in the presence of hard distortion. The method, based on the minimization of the output mutual information, requires knowledge of the log-derivative of the input distribution (the so-called score function). Each algorithm consists of three adaptive blocks: one devoted to adaptive estimation of the score function, and two others estimating the inverses of the linear and nonlinear parts of the channel, (quasi-)optimally adapted using the estimated score functions. This paper is mainly concerned with the nonlinear part, for which we propose two parametric models, the first based on a polynomial model and the second on a neural network, whereas [14, 15] proposed non-parametric approaches.
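The polynomial model of the inverse nonlinearity can be pictured with the sketch below. For brevity the compensator is fitted against a known reference signal; the paper's algorithm instead adapts it blindly, by minimizing the output mutual information with the estimated score functions.

import numpy as np

rng = np.random.default_rng(3)
s = rng.uniform(-1, 1, 5000)           # channel input (unknown in practice)
x = np.tanh(2.0 * s)                   # hard memoryless distortion

# odd polynomial compensator g(x) = a1*x + a3*x**3 + a5*x**5
basis = np.stack([x, x ** 3, x ** 5], axis=1)
coefs, *_ = np.linalg.lstsq(basis, s, rcond=None)
s_hat = basis @ coefs                  # approximately undoes the distortion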