993 results for A priori model


Relevance:

100.00%

Publisher:

Abstract:

The Boundary Element Method (BEM) is a discretisation technique for solving partial differential equations which, for certain problems, offers important advantages over domain techniques. Despite the large CPU-time reduction that can be achieved, some 3D problems remain untreatable today because of the extremely large number of degrees of freedom (dof) involved in the boundary description. Model reduction seems to be an appealing choice for both accurate and efficient numerical simulations. However, in the BEM a reduction in the number of degrees of freedom does not imply a significant reduction in CPU time, because in this technique the major part of the computing time is spent constructing the discrete system of equations. A reduction in the number of weighting functions therefore seems to be a key point in rendering boundary element simulations efficient.
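The cost structure described above can be illustrated with a toy dense-assembly loop (the kernel, geometry, and sizes below are illustrative, not the paper's formulation): every collocation point is paired with every source node, so assembly takes O(N²) kernel evaluations regardless of how the solution space is later reduced.

```python
import numpy as np

def greens_2d(x, y, eps=1e-9):
    """2-D Laplace free-space kernel -log|x-y|/(2*pi), regularized near r=0."""
    r = np.linalg.norm(x - y) + eps
    return -np.log(r) / (2.0 * np.pi)

def assemble(nodes):
    """Dense BEM-style influence matrix: n*n kernel evaluations."""
    n = len(nodes)
    A = np.empty((n, n))
    for i in range(n):            # every weighting (collocation) point...
        for j in range(n):        # ...pairs with every source node
            A[i, j] = greens_2d(nodes[i], nodes[j])
    return A

# 64 nodes on a unit circle standing in for a boundary discretisation
nodes = np.array([[np.cos(t), np.sin(t)]
                  for t in np.linspace(0, 2 * np.pi, 64, endpoint=False)])
A = assemble(nodes)
print(A.shape)  # (64, 64): assembly cost scales with dof squared
```

Reducing the solution basis after this loop shrinks only the solve, not the n² assembly, which is the bottleneck the abstract points at.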

Relevance:

70.00%

Publisher:

Abstract:

We propose a method for brain atlas deformation in the presence of large space-occupying tumors, based on an a priori model of lesion growth that assumes radial expansion of the lesion from its starting point. Our approach involves three steps. First, an affine registration brings the atlas and the patient into global correspondence. Then, the seeding of a synthetic tumor into the brain atlas provides a template for the lesion. The last step is the deformation of the seeded atlas, combining a method derived from optical flow principles with a model of lesion growth. Results show that a good registration is achieved and that the method can be applied to automatic segmentation of structures and substructures in brains with gross deformation, with important medical applications in neurosurgery, radiosurgery, and radiotherapy.
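The first step, affine registration, can be sketched as a least-squares fit of an affine transform between corresponding landmark sets (the landmarks and transform below are illustrative; the paper's registration operates on images, not point sets):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine fit: solve dst ~ src @ A.T + t."""
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])    # homogeneous coordinates
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P[:-1].T, P[-1]                   # A (3x3), t (3,)

# illustrative landmark sets related by a known affine transform
src = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
A_true = np.diag([1.1, 0.9, 1.0])
t_true = np.array([2.0, -1.0, 0.5])
dst = src @ A_true.T + t_true

A, t = fit_affine(src, dst)
print(np.allclose(A, A_true), np.allclose(t, t_true))  # True True
```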

Relevance:

70.00%

Publisher:

Abstract:

Thesis digitized by the Records Management and Archives Division of the Université de Montréal.

Relevance:

70.00%

Publisher:

Abstract:

Flash floods pose a significant danger to life and property. Unfortunately, in arid and semiarid environments runoff generation shows complex non-linear behavior with strong spatial and temporal non-uniformity. As a result, predictions made by physically based simulations in semiarid areas are subject to great uncertainty, and failures in the predictive behavior of existing models are common. Better descriptions of physical processes at the watershed scale therefore need to be incorporated into hydrological model structures. For example, terrain relief has systematically been considered static in flood modelling at the watershed scale. Here, we show that the integrated effect of small distributed relief variations, originated through concurrent hydrological processes within a storm event, was significant on the watershed-scale hydrograph. We model these observations by introducing dynamic formulations of two relief-related parameters at diverse scales: maximum depression storage, and the roughness coefficient in channels. In the final (a posteriori) model structure these parameters are allowed to be either time-constant or time-varying. The case under study is a convective storm in a semiarid Mediterranean watershed with ephemeral channels and high agricultural pressure (the Rambla del Albujón watershed; 556 km²), which showed a complex multi-peak response. First, to obtain plausible simulations with the (a priori) model with time-constant relief-related parameters, a spatially distributed parameterization was strictly required. Second, a generalized likelihood uncertainty estimation (GLUE) inference, applied to the improved model structure and conditioned on observed nested hydrographs, showed that accounting for dynamic relief-related parameters led to improved simulations. The discussion is finally broadened by considering the use of the calibrated model both to analyze the sensitivity of the watershed to storm motion and to attempt flood forecasting of a stratiform event with highly different behavior.
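A minimal sketch of a GLUE inference, assuming a toy two-parameter model in place of the distributed hydrological model (the parameter names and the Nash-Sutcliffe behavioural threshold are illustrative): sample parameter sets, score each against observations with a likelihood measure, and retain the "behavioural" sets above the threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 50)

# hypothetical toy "hydrograph" standing in for the distributed model;
# the two parameters play the role of depression storage and roughness
def model(storage, roughness):
    return storage * np.exp(-roughness * t)

observed = model(2.0, 0.8) + rng.normal(0.0, 0.05, t.size)

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: Monte Carlo sampling of parameter sets, then retain the
# "behavioural" ones whose likelihood exceeds the chosen threshold
samples = rng.uniform([0.5, 0.1], [5.0, 2.0], size=(5000, 2))
scores = np.array([nash_sutcliffe(model(s, r), observed) for s, r in samples])
behavioural = samples[scores > 0.7]

# the behavioural ensemble brackets the "true" parameters used above
print(len(behavioural), behavioural.mean(axis=0))
```

The spread of the behavioural ensemble, rather than a single best fit, is what GLUE propagates into the uncertainty bounds of the simulated hydrograph.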

Relevance:

60.00%

Publisher:

Abstract:

Brain deformations induced by space-occupying lesions may result in unpredictable position and shape of functionally important brain structures. The aim of this study is to propose a method for segmentation of brain structures by deformation of a segmented brain atlas in the presence of a space-occupying lesion. Our approach is based on an a priori model of lesion growth (MLG) that assumes radial expansion from a seeding point and involves three steps: first, an affine registration bringing the atlas and the patient into global correspondence; then, the seeding of a synthetic tumor into the brain atlas, providing a template for the lesion; finally, the deformation of the seeded atlas, combining a method derived from optical flow principles with a model of lesion growth. The method was applied to two meningiomas inducing a pure displacement of the underlying brain structures, and the segmentation accuracy of the ventricles and basal ganglia was assessed. Results show that the segmented structures were consistent with the patient's anatomy and that the deformation accuracy of surrounding brain structures was highly dependent on accurate placement of the tumor seeding point. Further improvements of the method will optimize the segmentation accuracy. Visualization of brain structures provides useful information for the therapeutic consideration of space-occupying lesions, including surgical, radiosurgical, and radiotherapeutic planning, in order to increase treatment efficiency and prevent neurological damage.
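The radial-expansion assumption of the MLG can be sketched as a displacement field pointing away from the seeding point, with magnitude decaying with distance (the seed position, tumor radius, and decay law below are illustrative, not the paper's actual model):

```python
import numpy as np

def radial_displacement(points, seed, tumor_radius, falloff=10.0):
    """Push each point away from `seed` along the radial direction,
    with displacement magnitude decaying with distance from the seed."""
    d = points - seed                          # vectors seed -> point
    r = np.linalg.norm(d, axis=1, keepdims=True)
    r = np.where(r == 0, 1e-9, r)              # avoid division by zero
    magnitude = tumor_radius * np.exp(-r / falloff)
    return d / r * magnitude                   # outward radial displacement

# two illustrative voxels: one near the seed, one far away
pts = np.array([[5.0, 0.0, 0.0], [0.0, 20.0, 0.0]])
u = radial_displacement(pts, seed=np.zeros(3), tumor_radius=4.0)
# structures closer to the seed are displaced further than distant ones
print(np.linalg.norm(u[0]) > np.linalg.norm(u[1]))  # True
```

This dependence of the field on the seed location is why the abstract reports the deformation accuracy to hinge on accurate placement of the seeding point.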

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND Satisfaction of physicians is a concern in the healthcare sector, and studying their higher-order needs requires a multi-dimensional questionnaire in Spanish. The objectives of this study are to adapt the 4CornerSAT Questionnaire to measure career satisfaction of physicians and to evaluate its validity in our context. METHOD The 4CornerSAT Questionnaire was adapted into Spanish and validated among physicians of hospitals in Andalusia, Spain. A confirmatory factor analysis (CFA) was performed to corroborate the a priori model, and internal consistency and construct validity were evaluated through Cronbach's alpha and the correlation between the summed scale and the global item, respectively. RESULTS The adapted questionnaire was administered to 121 specialist physicians. The CFA corroborated the four dimensions of the questionnaire (χ2=114.64, df=94, p<0.07; χ2/df=1.22; RMSEA=0.04). Internal consistency reached α=0.92, and the correlation between the summed scale and the global item verified the construct validity (r=0.77; p<0.001). CONCLUSIONS The 4CornerSAT Questionnaire was adapted to Spanish, showing adequate construct validity and internal consistency.
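Cronbach's alpha, used here for internal consistency, is computed from the item variances and the variance of the summed scale; a minimal sketch with simulated item scores (the data are illustrative, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) score matrix.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of the sum)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# illustrative data: three items driven by one latent trait cohere strongly
rng = np.random.default_rng(1)
latent = rng.normal(size=200)
scores = np.column_stack([latent + rng.normal(0, 0.3, 200) for _ in range(3)])
alpha = cronbach_alpha(scores)
print(alpha > 0.8)  # True: highly correlated items give a high alpha
```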

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND Job satisfaction of nurses is a determinant factor in the quality and organizational adaptation of clinical management models in the current socio-economic context. The aim of this study was to construct and validate a questionnaire to measure job satisfaction of nurses in the Clinical Management Units of the Andalusian Public Health System. METHODS Clinimetric, cross-sectional study with a sample of 314 nurses from two university hospitals in Seville. Nurses were surveyed in 2011, from March to June. We used the Font Roja questionnaire adapted to our study variables. We performed analyses of correlations, reliability and construct validity, using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) to test the a priori model. RESULTS The final questionnaire consists of 10 items, with an internal consistency of 0.75 and a percentage of explained variance of 63.67%. The CFA confirmed 4 dimensions (work environment, work relationships, motivation, and recognition): significant χ2 (p < .001); χ2/df = 2.013; GFI = 0.958, RMR = 0.055 and RMSEA = 0.057; AGFI = 0.927, NFI = 0.878, TLI = 0.902, CFI = 0.933 and IFI = 0.935; AIC = 132.486 and ECVI = 0.423. CONCLUSION This new questionnaire (G_Clinic) improves on the clinimetric values of the Font Roja questionnaire: it reduces the number of items, improves the reliability of the dimensions, increases the explained variance, and allows job satisfaction of nurses in clinical management to be assessed.
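The "percentage of explained variance" reported for the EFA can be sketched as the share of correlation-matrix eigenvalues captured by the retained factors (the simulated items and factor structure below are illustrative, not the study's data):

```python
import numpy as np

# illustrative data: 10 items driven by 4 latent dimensions, 314 respondents
rng = np.random.default_rng(3)
factors = rng.normal(size=(314, 4))
loadings = rng.normal(size=(4, 10))
items = factors @ loadings + rng.normal(0, 0.5, (314, 10))

# eigendecomposition of the item correlation matrix; the variance
# explained by k factors is the share of the top-k eigenvalues
R = np.corrcoef(items, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
explained = eigvals[:4].sum() / eigvals.sum() * 100
print(round(explained, 1))
```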

Relevance:

60.00%

Publisher:

Abstract:

Feature extraction is the part of pattern recognition in which the sensor data are transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system, while preserving the information essential for discriminating the data into different classes. For instance, in image analysis the actual image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features that are invariant to certain types of illumination changes. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features; low-level Local Binary Pattern (LBP) based features play the main role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely affected by the decisions made during the implementation phase. The implementation alternatives of LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework, by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented. Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which LBPs are seen as combinations of n-tuples is also presented.
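The basic 3×3 LBP operator underlying these features thresholds the eight neighbours against the centre pixel and packs the resulting bits into an 8-bit code; a minimal sketch (neighbour ordering and threshold conventions vary between implementations):

```python
import numpy as np

def lbp_code(patch):
    """LBP code of the centre pixel of a 3x3 patch: each neighbour
    contributes a 1-bit if it is >= the centre value, packed clockwise."""
    c = patch[1, 1]
    neigh = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
             patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    bits = [1 if n >= c else 0 for n in neigh]
    return sum(b << i for i, b in enumerate(bits))

patch = np.array([[6, 5, 2],
                  [7, 5, 1],
                  [9, 8, 3]])
print(lbp_code(patch))  # 227
```

Because the code depends only on sign comparisons against the centre, it is invariant to any monotonic change of the local intensities, which is the illumination robustness the thesis exploits.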

Relevance:

60.00%

Publisher:

Abstract:

Gravity field parameters are usually determined from observations of the GRACE satellite mission together with arc-specific parameters in a generalized orbit determination process. When the estimation of gravity field parameters is separated from the determination of the satellites' orbits, correlations between orbit parameters and gravity field coefficients are ignored and the latter are biased towards the a priori force model. We are thus confronted with a kind of hidden regularization. To decipher the underlying mechanisms, the Celestial Mechanics Approach is complemented by tools to modify the impact of the pseudo-stochastic arc-specific parameters at the normal-equation level and to efficiently generate ensembles of solutions. By introducing a time-variable a priori model and solving for hourly pseudo-stochastic accelerations, a significant reduction of noisy striping in the monthly solutions can be achieved. Setting up more frequent pseudo-stochastic parameters results in a further reduction of the noise, but also in a notable damping of the observed geophysical signals. To quantify the effect of the a priori model on the monthly solutions, the process of fixing the orbit parameters is replaced by an equivalent introduction of special pseudo-observations, i.e., by explicit regularization. The contribution of the a priori information thereby introduced is determined by a contribution analysis. The presented mechanism is universally valid: it may be used to separate any subset of parameters by pseudo-observations of a special design and to quantify the damage imposed on the solution.
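The mechanism of replacing fixed parameters by weighted pseudo-observations can be sketched on a toy least-squares problem (sizes, weights, and the identity-weight design below are illustrative, not the actual GRACE normal equations): a large pseudo-observation weight drags the estimate toward the a priori value, and the share of the solution determined by the prior can be read off a resolution-type matrix, in the spirit of a contribution analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(20, 3))                 # design matrix (illustrative)
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true + rng.normal(0.0, 0.1, 20)

N, b = A.T @ A, A.T @ y                      # normal equations N x = b
x0 = np.zeros(3)                             # a priori parameter values

# pseudo-observations x ~ x0 with weight w: (N + w I) x = b + w x0;
# a large w is equivalent to fixing the parameters to the a priori model
for w in (0.0, 1e3):
    x_hat = np.linalg.solve(N + w * np.eye(3), b + w * x0)
    print(w, np.round(x_hat, 3))

# contribution analysis: the share of each estimate determined by the
# pseudo-observations is the diagonal of the resolution-type matrix
w = 1e3
contribution = np.diag(np.linalg.solve(N + w * np.eye(3), w * np.eye(3)))
print(np.round(contribution, 2))             # near 1: prior dominates
```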

Relevance:

40.00%

Publisher:

Abstract:

Peer reviewed

Relevance:

40.00%

Publisher:

Abstract:

Peer reviewed

Relevance:

40.00%

Publisher:

Abstract:

Peer reviewed

Relevance:

30.00%

Publisher:

Abstract:

We introduce the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS). CATT-BRAMS is an on-line transport model fully consistent with the simulated atmospheric dynamics. Emission sources of trace gases from biomass burning and urban-industrial-vehicular activities, and of aerosol particles from biomass burning, are obtained from several published datasets and remote sensing information. The tracer and aerosol mass concentration prognostics include, in addition to grid-scale transport, the effects of sub-grid-scale turbulence in the planetary boundary layer, convective transport by shallow and deep moist convection, wet and dry deposition, and plume rise associated with vegetation fires. The radiation parameterization takes into account the interaction between the simulated biomass burning aerosol particles and short- and long-wave radiation. The atmospheric model BRAMS is based on the Regional Atmospheric Modeling System (RAMS), with several improvements associated with the cumulus convection representation, soil moisture initialization and a surface scheme tuned for the tropics, among others. In this paper the CATT-BRAMS model is used to simulate carbon monoxide and particulate material (PM2.5) surface fluxes and atmospheric transport during the 2002 LBA field campaigns, conducted during the transition from the dry to the wet season in the southwest Amazon Basin. Model evaluation is addressed with comparisons between model results and near-surface, radiosonde and airborne measurements performed during the field campaign, as well as remote sensing derived products. We show the matching of emission strengths to observed carbon monoxide in the LBA campaign. A relatively good match to the MOPITT data is also obtained, in spite of the fact that the MOPITT a priori assumptions introduce several difficulties.

Relevance:

30.00%

Publisher:

Abstract:

Background Meta-analysis is increasingly being employed as a screening procedure in large-scale association studies to select promising variants for follow-up studies. However, standard methods for meta-analysis require the assumption of an underlying genetic model, which is typically unknown a priori. This drawback can introduce model misspecification, making power suboptimal, or require the evaluation of multiple genetic models, which increases the number of false-positive associations, ultimately leading to a waste of resources on fruitless replication studies. We used simulated meta-analyses of large genetic association studies to investigate naive strategies of genetic model specification to optimize screenings of genome-wide meta-analysis signals for further replication. Methods Different methods, meta-analytical models and strategies were compared in terms of power and type I error. Simulations were carried out for a binary trait over a wide range of true genetic models, genome-wide thresholds, minor allele frequencies (MAFs), odds ratios and between-study heterogeneity (τ²). Results Among the investigated strategies, a simple Bonferroni-corrected approach that fits both multiplicative and recessive models was found to be optimal in most examined scenarios, reducing the likelihood of false discoveries and enhancing power in scenarios with small MAFs, either in the presence or absence of heterogeneity. Nonetheless, this strategy is sensitive to τ² whenever the susceptibility allele is common (MAF ≥ 30%), resulting in an increased number of false-positive associations compared with an analysis that considers only the multiplicative model. Conclusion Invoking a simple Bonferroni adjustment and testing for both multiplicative and recessive models is fast and an optimal strategy in large meta-analysis-based screenings. However, care must be taken when the examined variants are common, where specification of the multiplicative model alone may be preferable.
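The recommended screening strategy can be sketched for a single variant (the genotype counts and the allele-count approximation to the multiplicative model are illustrative): test the variant under both codings and Bonferroni-correct the significance threshold over the two models.

```python
import math
import numpy as np

def chi2_sf_1df(x):
    """Survival function of the chi-square distribution with 1 df."""
    return math.erfc(math.sqrt(x / 2.0))

def chi2_2x2(table):
    """Pearson chi-square p-value for a 2x2 contingency table."""
    t = np.asarray(table, dtype=float)
    exp = np.outer(t.sum(axis=1), t.sum(axis=0)) / t.sum()
    return chi2_sf_1df(((t - exp) ** 2 / exp).sum())

# illustrative genotype counts [AA, Aa, aa] for one variant
cases    = np.array([30, 50, 40])
controls = np.array([50, 50, 20])

# multiplicative model approximated by the allele-count (2N alleles) table
allele_table = [[2 * cases[0] + cases[1], cases[1] + 2 * cases[2]],
                [2 * controls[0] + controls[1], controls[1] + 2 * controls[2]]]
# recessive model: aa versus AA + Aa
recessive_table = [[cases[2], cases[0] + cases[1]],
                   [controls[2], controls[0] + controls[1]]]

p_mult, p_rec = chi2_2x2(allele_table), chi2_2x2(recessive_table)
alpha = 5e-8                                  # genome-wide threshold
signal = min(p_mult, p_rec) < alpha / 2       # Bonferroni over the 2 models
print(p_mult, p_rec, signal)
```

Halving the threshold (alpha/2) is what keeps the two-model strategy's type I error controlled while still capturing recessive effects a multiplicative-only screen would miss.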

Relevance:

30.00%

Publisher:

Abstract:

The dispersion model with mixed boundary conditions uses a single parameter, the dispersion number, to describe the hepatic elimination of xenobiotics and endogenous substances. An implicit a priori assumption of the model is that the transit time density of intravascular indicators is approximated by an inverse Gaussian distribution. This approximation is limited in that the model poorly describes the tail part of the hepatic outflow curves of vascular indicators. A sum of two inverse Gaussian functions is proposed as an alternative, more flexible empirical model for the transit time densities of vascular references. This model suggests that a more accurate description of the tail portion of vascular reference curves yields an elimination rate constant (or intrinsic clearance) that is 40% less than predicted by the dispersion model with mixed boundary conditions. The results emphasize the need to describe outflow curves accurately when using them as a basis for determining pharmacokinetic parameters with hepatic elimination models. © 1997 Society for Mathematical Biology.
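A minimal sketch of the proposed density (all parameter values are illustrative): a weighted sum of two inverse Gaussians, whose slow component lifts the tail that a single inverse Gaussian underestimates.

```python
import math

def inverse_gaussian(t, mean, lam):
    """Inverse Gaussian pdf with mean `mean` and shape parameter `lam`."""
    return math.sqrt(lam / (2 * math.pi * t**3)) * \
           math.exp(-lam * (t - mean) ** 2 / (2 * mean**2 * t))

def two_ig_mixture(t, w, m1, l1, m2, l2):
    """Weighted sum of two inverse Gaussians; a slow second component
    (m2 > m1) lets the model follow the heavy tail of the outflow curve."""
    return w * inverse_gaussian(t, m1, l1) + (1 - w) * inverse_gaussian(t, m2, l2)

# at a late time the mixture sits well above the single fast component,
# capturing the tail the one-parameter dispersion model underestimates
t_late = 30.0
single = inverse_gaussian(t_late, 5.0, 10.0)
mix = two_ig_mixture(t_late, 0.8, 5.0, 10.0, 20.0, 15.0)
print(mix > single)  # True
```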