874 results for Filmic approach methods


Relevance: 30.00%

Abstract:

Object. The aim of this study was to describe the surgical anatomy of the mediobasal aspect of the temporal lobe and the supracerebellar transtentorial (SCTT) approach performed not with an opening but with resection of the tentorium, as an alternative route for the neurosurgical management of vascular and tumoral lesions arising from this region. Methods. Cadaveric specimens were used to illustrate the surgical anatomy of the mediobasal region of the temporal lobe. Demographic aspects, characteristics of lesions, clinical presentation, surgical results, follow-up findings, and outcomes were retrospectively reviewed for patients referred for the SCTT approach with tentorial resection. Results. Ten patients (83%) were female and two (17%) were male. Their ages ranged from 6 to 59 years (mean 34.5 +/- 15.8 years). All lesions (3 posterior cerebral artery aneurysms, 3 arteriovenous malformations, 3 cavernous malformations, and 3 tumors) were completely excluded or resected. After a mean follow-up period of 143 months (range 10-240 months), the mean postoperative Glasgow Outcome Scale score was 4.9. Conclusions. Knowledge of the surgical anatomy underpins improvements in microsurgical approaches. The evolution from a small opening to resection of the tentorium markedly changed the exposure of the mediobasal aspect of the temporal lobe. The SCTT approach with tentorial resection is an excellent alternative route to the posterior part of the mediobasal aspect of the temporal lobe, and it proved sufficient for optimal neurosurgical management of tumoral and vascular lesions located in this area. (http://thejns.org/doi/abs/10.3171/2011.12.JNS111256)

Relevance: 30.00%

Abstract:

In past decades, efforts at quantifying the complexity of systems with a general tool have usually relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann-Gibbs-Shannon entropy or one of its extensions. In recent years, however, there have been attempts to quantify the algorithmic complexity of quantum systems based on Kolmogorov algorithmic complexity, with results that diverge from those of the classical approach. Therefore, a complexity measure is proposed here within the quantum information formalism, one that takes advantage of the generality of the classical-based complexities and is capable of expressing the complexity of these systems in a framework other than the algorithmic one. To do so, the Shiner-Davison-Landsberg (SDL) complexity framework is considered jointly with the linear entropy of the density operators representing the analyzed systems and with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.
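
As a rough, non-authoritative illustration of the ingredients named above, the sketch below computes a normalized linear entropy and an SDL-style order-times-disorder complexity for two-qubit density operators. It assumes the simple SDL form C = Delta(1 - Delta), with the disorder Delta taken as the normalized linear entropy, and uses a Werner family as a convenient stand-in for entangled mixed states; the paper's exact construction, including its use of the tangle, may differ.

```python
# Sketch: SDL-style complexity from linear entropy for two-qubit density
# operators. Assumes C = Delta * (1 - Delta) with Delta the normalized
# linear entropy; the paper's construction (and its tangle-based
# entanglement measure) may differ.
import numpy as np

def linear_entropy(rho: np.ndarray) -> float:
    """Normalized linear entropy S_L = d/(d-1) * (1 - Tr(rho^2))."""
    d = rho.shape[0]
    return (d / (d - 1)) * (1.0 - np.trace(rho @ rho).real)

def sdl_complexity(rho: np.ndarray) -> float:
    """Order times disorder: zero for pure and for maximally mixed states."""
    delta = linear_entropy(rho)    # disorder in [0, 1]
    return delta * (1.0 - delta)   # maximal at delta = 1/2

# Werner-like family: p |Phi+><Phi+| + (1 - p) I/4.
phi = np.zeros(4)
phi[0] = phi[3] = 1.0 / np.sqrt(2.0)
bell = np.outer(phi, phi)
for p in (0.0, 0.5, 1.0):
    rho = p * bell + (1 - p) * np.eye(4) / 4
    print(f"p={p:.1f}  S_L={linear_entropy(rho):.3f}  C={sdl_complexity(rho):.3f}")
```

Both the pure Bell state (p = 1) and the maximally mixed state (p = 0) receive zero complexity, which is the defining feature of SDL-type measures.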

Relevance: 30.00%

Abstract:

Conclusion: The extended retrolabyrinthine approach (RLA) is a safe and reliable approach for auditory brainstem implant placement in children. The surgical landmarks needed to reach the cochlear nucleus are adequately exposed by this approach. Objective: To describe a new approach option for auditory brainstem implants (ABIs) in children, highlighting the anatomical landmarks required to appropriately expose the foramen of Luschka. Methods: Three prelingually deafened children were consecutively operated on for ABI placement via the RLA. Results: ABI placement via the RLA was successfully performed in all children, without complications other than multidirectional nystagmus in one child. The RLA we employed differed from that used for vestibular schwannoma only in the removal of the posterior semicircular canal. The lateral and superior semicircular canals and the vestibule remained intact, and there was no need to expose the dura of the internal auditory meatus. The jugular bulb was completely exposed to allow adequate visualization of the ninth cranial nerve and the cerebellar flocculus.

Relevance: 30.00%

Abstract:

Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items, and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is considered jointly with the development of model-fit assessment tools. The results are compared with those obtained by Azevedo et al., and indicate that the hierarchical approach makes MCMC algorithms easier to implement, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
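
To make the key ingredient concrete: by Henze's (1986) result, if U0 and U1 are independent standard normals, then Z = delta|U0| + sqrt(1 - delta^2) U1 is skew-normal with shape parameter lambda = delta / sqrt(1 - delta^2); conditioning on the half-normal component is what allows the hierarchical sampler to get by with fewer Metropolis-Hastings steps. The sketch below only illustrates the representation itself and is not the authors' MHWGS implementation.

```python
# Hypothetical illustration of Henze's stochastic representation of the
# skew-normal (not the authors' MHWGS sampler): with U0, U1 iid N(0, 1),
# Z = delta*|U0| + sqrt(1 - delta^2)*U1 is skew-normal.
import numpy as np

def sample_skew_normal(delta: float, size: int, seed: int = 0) -> np.ndarray:
    """Draw skew-normal variates via the hierarchical (half-normal) form."""
    rng = np.random.default_rng(seed)
    u0 = np.abs(rng.standard_normal(size))   # shared half-normal component
    u1 = rng.standard_normal(size)           # independent normal component
    return delta * u0 + np.sqrt(1.0 - delta**2) * u1

z = sample_skew_normal(delta=0.8, size=100_000)
skew = ((z - z.mean()) ** 3).mean() / z.std() ** 3   # positive for delta > 0
print(f"mean={z.mean():.3f}  skewness={skew:.3f}")
```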

Relevance: 30.00%

Abstract:

This work describes a methodology to simulate free-surface incompressible multiphase flows. This novel methodology allows the simulation of multiphase flows with an arbitrary number of phases, each having a different density and viscosity. Surface and interfacial tension effects are also included. The numerical technique is based on the GENSMAC front-tracking method. The velocity field is computed using a finite-difference discretization of a modification of the Navier-Stokes equations. These equations, together with the continuity equation, are solved for two-dimensional multiphase flows with phase-dependent densities and viscosities. The governing equations are solved on a regular Eulerian grid, and a Lagrangian mesh is employed to track free surfaces and interfaces. The method is validated by comparing numerical results with analytic ones for a number of simple problems; it was also employed to simulate complex problems for which no analytic solutions are available. The method presented in this paper has been shown to be robust and computationally efficient. Copyright (c) 2012 John Wiley & Sons, Ltd.
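
To make the front-tracking idea concrete, here is a minimal sketch, not the GENSMAC implementation (whose discretization is considerably more involved): Lagrangian interface markers are advected with a velocity field stored on a regular Eulerian grid, using bilinear interpolation. The node-centered grid layout and the forward-Euler update are illustrative assumptions.

```python
# Sketch of front tracking: markers on an interface move with the velocity
# interpolated from a regular Eulerian grid. Assumes node-centered fields
# with spacing h and markers strictly inside the domain.
import numpy as np

def bilinear(u: np.ndarray, x: float, y: float, h: float) -> float:
    """Bilinearly interpolate grid field u at the point (x, y)."""
    i, j = int(x / h), int(y / h)
    fx, fy = x / h - i, y / h - j
    return ((1 - fx) * (1 - fy) * u[i, j] + fx * (1 - fy) * u[i + 1, j]
            + (1 - fx) * fy * u[i, j + 1] + fx * fy * u[i + 1, j + 1])

def advect_front(px, py, u, v, h, dt):
    """One forward-Euler step for every Lagrangian marker."""
    for k in range(len(px)):
        ux = bilinear(u, px[k], py[k], h)   # interpolate both components
        uy = bilinear(v, px[k], py[k], h)   # before moving the marker
        px[k] += dt * ux
        py[k] += dt * uy

h, dt = 0.1, 0.01
u, v = np.ones((11, 11)), np.zeros((11, 11))   # uniform rightward flow
px, py = [0.35, 0.45], [0.50, 0.50]            # two interface markers
advect_front(px, py, u, v, h, dt)
print(px, py)
```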

Relevance: 30.00%

Abstract:

The sera of a retrospective cohort (n = 41) composed of children with well-characterized cow's milk allergy, collected over multiple visits, were analyzed using a protein microarray system measuring four classes of immunoglobulins. The frequency of the visits and the age and gender distribution reflected the real situation faced by clinicians at a pediatric reference center for food allergy in São Paulo, Brazil. The profiling array results showed that total IgG and IgA share similar specificity, whilst IgM and in particular IgE are distantly related. The correlation between IgE and IgA specificity is variable among the patients, and this relationship cannot be used to predict atopy or the onset of tolerance to milk. The array profiling technique corroborated the clinical selection criteria for this cohort, although it clearly suggested that 4 of the 41 patients might have allergies of non-milk origin. There was also a good correlation between the array data and ImmunoCAP results, particularly for casein. By using qualitative and quantitative multivariate analysis routines it was possible to produce validated statistical models that predict, with reasonable accuracy, the onset of tolerance to milk proteins. If expanded to larger study groups, array profiling in combination with these multivariate techniques shows potential to improve the prognosis of milk-allergic patients. (C) 2012 Elsevier B.V. All rights reserved.
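
As a hedged illustration of the kind of multivariate prediction described, not the authors' actual pipeline, the sketch below fits a cross-validated logistic model predicting onset of tolerance from per-patient immunoglobulin array features; the feature matrix and labels are invented placeholders.

```python
# Hypothetical sketch: predict onset of milk tolerance from immunoglobulin
# array profiles. Features and outcomes are invented; the paper's
# multivariate routines and validation scheme differ.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(41, 8))      # e.g. IgE/IgG/IgA/IgM signals vs. casein, whey proteins, ...
y = rng.integers(0, 2, size=41)   # 1 = became tolerant, 0 = remained allergic

model = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(model, X, y, cv=5)   # cross-validation guards the small cohort
print(f"CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```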

Relevance: 30.00%

Abstract:

Purpose: The purpose of this study was to test the psychometric properties of the Neurobehavior Inventory (NBI) in a group of temporal lobe epilepsy (TLE) patients from a tertiary care center, correlating its scores with the presence of psychiatric symptoms. Methods: Clinical and sociodemographic data from ninety-six TLE outpatients were collected, and a neuropsychiatric evaluation was performed with the following instruments: Mini-Mental State Examination (MMSE), structured psychiatric interview (MINI-PLUS), Neurobehavior Inventory (NBI), and Hamilton Depression Rating Scale (HAM-D). Results: Some traits evaluated by the NBI showed adequate internal consistency (mean inter-item correlation between 0.2 and 0.4) and were frequent, such as religiosity (74%) and repetitiveness (60.4%). Principal component analysis showed three factors, named here emotions (Factor 1), hyposexuality (Factor 2), and unusual ideas (Factor 3). Depressive symptoms on the HAM-D showed a strong association with the emotions and hyposexuality factors. When patients with left TLE and right TLE were compared, the former exhibited more sadness (p = 0.017), and the latter a greater tendency toward a sense of personal destiny (p = 0.028). Conclusion: Depression influences NBI scoring, mainly on the emotionality and hyposexuality traits. The NBI subscales can be better interpreted with an appropriate evaluation of comorbid mood and anxiety disorders. Compromise of left mesial temporal structures is associated with an increased tendency toward sad affect, whereas right temporal pathology is associated with increased beliefs in personal destiny. (C) 2012 Elsevier Inc. All rights reserved.
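
For concreteness, here is a small sketch of the two psychometric computations mentioned, mean inter-item correlation for internal consistency and a three-component principal component analysis, run on hypothetical item-level data (the real NBI item scores are not reproduced here).

```python
# Sketch of the psychometric steps on invented data: rows = patients,
# columns = NBI items. The actual NBI item structure differs.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
items = rng.normal(size=(96, 10))   # hypothetical scores, 96 patients x 10 items

# Internal consistency: mean inter-item correlation (adequate band: 0.2-0.4).
r = np.corrcoef(items, rowvar=False)
mean_iic = r[np.triu_indices_from(r, k=1)].mean()

# Principal component analysis; the study retained three factors.
pca = PCA(n_components=3).fit(items)
print(f"mean inter-item r = {mean_iic:.2f}")
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
```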

Relevance: 30.00%

Abstract:

Background: Support for the adverse effect of high income inequality on population health has come from studies that focus on larger areas, such as the US states, while studies of smaller geographical areas (eg, neighbourhoods) have found mixed results. Methods: We used propensity score matching to examine the relationship between income inequality and mortality rates across 96 neighbourhoods (distritos) of the municipality of Sao Paulo, Brazil. Results: Prior to matching, higher income inequality distritos (Gini >= 0.25) had slightly lower overall mortality rates (2.23 per 10 000, 95% CI -23.92 to 19.46) compared to lower income inequality areas (Gini < 0.25). After propensity score matching, higher inequality was associated with a statistically significantly higher mortality rate (41.58 per 10 000, 95% CI 8.85 to 73.3). Conclusion: In Sao Paulo, the more egalitarian communities are among the poorest, with the worst health profiles. Propensity score matching was used to avoid inappropriate comparisons between the health status of unequal (but wealthy) neighbourhoods and equal (but poor) neighbourhoods. Our methods suggest that, with proper accounting for heterogeneity between areas, income inequality is associated with worse population health in Sao Paulo.
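
A minimal sketch of nearest-neighbour propensity-score matching in the spirit of this analysis follows, with invented district-level data and covariates; the study's actual matching specification may differ. Each high-inequality district is matched, with replacement, to the control district with the closest estimated propensity score.

```python
# Hypothetical propensity-score matching sketch: match high-Gini districts
# to low-Gini districts on covariates, then compare mortality. All data
# and covariate choices are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 96
covariates = rng.normal(size=(n, 3))   # e.g. income level, education, density
treated = rng.random(n) < 0.5          # True = Gini >= 0.25
mortality = rng.normal(150.0, 20.0, size=n)

# Propensity score: P(high inequality | covariates).
ps = LogisticRegression().fit(covariates, treated).predict_proba(covariates)[:, 1]

# 1:1 nearest-neighbour matching on the propensity score, with replacement.
controls = np.where(~treated)[0]
diffs = [mortality[i] - mortality[controls[np.argmin(np.abs(ps[controls] - ps[i]))]]
         for i in np.where(treated)[0]]
print(f"matched mortality difference: {np.mean(diffs):.1f}")
```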

Relevance: 30.00%

Abstract:

The development of new statistical and computational methods is increasingly making it possible to bridge the gap between the hard sciences and the humanities. In this study, we propose an approach based on a quantitative evaluation of the attributes of objects in fields of the humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in the humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
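
The bootstrap-style augmentation described can be sketched as below, with invented feature values and an assumed Gaussian perturbation around the original vectors; the paper's exact resampling scheme may differ.

```python
# Sketch: generate artificial "composers"/"philosophers" by perturbing the
# original 7 x 8 feature matrix, mitigating the tiny sample. The scale of
# the perturbation is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(3)
originals = rng.normal(size=(7, 8))   # 7 names x 8 quantified attributes (invented)

def augment(X: np.ndarray, n_new: int = 200, scale: float = 0.1) -> np.ndarray:
    """Draw artificial samples around randomly chosen original rows."""
    idx = rng.integers(0, len(X), size=n_new)
    return X[idx] + rng.normal(scale=scale, size=(n_new, X.shape[1]))

artificial = augment(originals)
print(artificial.shape)   # (200, 8): hundreds of bootstrap-influenced samples
```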

Relevance: 30.00%

Abstract:

Selective modulation of liver X receptor beta (LXR beta) has been recognized as an important approach to prevent or reverse the atherosclerotic process. In the present work, we have developed robust, conformation-independent, fragment-based quantitative structure-activity and structure-selectivity relationship models for a series of quinolines and cinnolines as potent modulators of the two LXR subtypes. The generated models were then used to predict the potency of an external test set, and the predicted values were in good agreement with the experimental results, indicating the potential of the models for untested compounds. The final 2D molecular recognition patterns obtained were integrated with 3D structure-based molecular modeling studies to provide useful insights into the chemical and structural determinants of increased LXR beta binding affinity and selectivity. (C) 2011 Elsevier Inc. All rights reserved.
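
As a loose illustration of a conformation-independent, fragment-based QSAR workflow, not the authors' descriptor set or model, the sketch below hashes 2D circular fragments into fingerprints with RDKit and regresses an invented potency on them; the SMILES and activity values are placeholders.

```python
# Hypothetical 2D fragment-based QSAR sketch: circular fingerprints plus a
# ridge regression. Molecules and pIC50 values are invented placeholders.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.linear_model import Ridge

smiles = ["c1ccc2ncccc2c1",      # quinoline core
          "c1ccc2c(c1)ccnn2",    # cinnoline core
          "Cc1cnc2ccccc2c1"]     # a methylquinoline
activity = np.array([6.1, 5.4, 6.3])   # invented pIC50 values

fps = [AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), radius=2, nBits=1024)
       for s in smiles]
X = np.array([list(fp) for fp in fps], dtype=float)   # fragment presence matrix

model = Ridge(alpha=1.0).fit(X, activity)
print(np.round(model.predict(X), 2))
```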

Relevance: 30.00%

Abstract:

Many techniques have been proposed for root coverage. However, none of them offers predictable results in deep and wide recessions. Objective: The aim of this case series report is to describe an alternative technique for root coverage at buccal sites showing deep recessions and attachment loss >4 mm. Material and Methods: Four patients presenting deep recession defects at buccal sites (>= 4 mm) were treated with the newly forming bone graft technique, which consists of creating an alveolar socket in the edentulous ridge and transferring the granulation tissue formed in this socket to the recession defect after 21 days. Clinical periodontal parameters, including recession depth (RD), probing depth (PD), clinical attachment level (CAL), bleeding on probing (BOP), plaque index (PI) and keratinized gingiva width (KGW), were evaluated by a single examiner immediately before surgery and at 1, 3, 6 and 9 months postoperatively. Results: All cases showed reductions in RD and PD, along with CAL gain, although no increase in KGW was observed. These findings suggest that the technique may favor periodontal regeneration along with root coverage, especially in areas showing deep recessions and attachment loss.

Relevance: 30.00%

Abstract:

Objective: Partial nephrectomy for small kidney tumors has increased in recent decades, and the approach to non-palpable endophytic tumors has become a challenge, with greater chances of positive margins or complications. The aim of this study is to describe an alternative nephron-sparing approach for small endophytic kidney tumors through anatrophic nephrotomy. Patients and Methods: A retrospective analysis of patients undergoing partial nephrectomy at our institution was performed, and the subjects with endophytic tumors treated with anatrophic nephrotomy were identified. Patient demographics, perioperative outcomes and oncological results were evaluated. Results: Among the partial nephrectomies performed for intraparenchymal tumors between 06/2006 and 06/2010, ten patients underwent anatrophic nephrotomy. The mean patient age was 42 years, and the mean tumor size was 2.3 cm. Mean warm ischemia time was 22.4 min, and histopathological analysis showed 80% clear cell carcinomas. At a mean follow-up of 36 months, no significant creatinine changes and no local or systemic recurrences were observed. Conclusion: The operative technique described is a safe and effective nephron-sparing option for complete removal of endophytic renal tumors.

Relevance: 30.00%

Abstract:

Current scientific applications produce large amounts of data, and the processing, handling and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. In order to achieve this goal, distributed storage systems have adopted techniques such as data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Based on these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that this new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
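
A toy version of the core idea, classifying an access series by a simple property and then selecting a forecasting model accordingly, might look like the sketch below; the classification rule and the two candidate models are illustrative stand-ins for the techniques the paper selects among.

```python
# Sketch: pick a predictor from a property of the access-trace time series.
# Strong lag-1 autocorrelation selects an AR(1) forecast; otherwise fall
# back to the series mean. Thresholds and models are illustrative.
import numpy as np

def predict_next(series: np.ndarray) -> float:
    """Classify the series by lag-1 autocorrelation, then forecast one step."""
    x = series - series.mean()
    rho1 = (x[:-1] * x[1:]).sum() / (x * x).sum()   # lag-1 autocorrelation
    if abs(rho1) > 0.5:                             # strongly autocorrelated
        return series.mean() + rho1 * (series[-1] - series.mean())  # AR(1)
    return float(series.mean())                     # fallback: mean model

trace = np.array([10, 12, 11, 13, 12, 14, 13, 15], dtype=float)  # invented trace
print(f"predicted next access cost: {predict_next(trace):.1f}")
```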

Relevance: 30.00%

Abstract:

Sensors and actuators based on laminated piezocomposite shells are in increasing demand in the field of smart structures. The distribution of piezoelectric material within the material layers affects the performance of these structures; therefore, its amount, shape, size, placement, and polarization should be considered simultaneously in an optimization problem. In addition, previous works suggest that the concept of a laminated piezocomposite structure that includes a fiber-reinforced composite layer can increase the performance of these piezoelectric transducers; however, the design optimization of these devices has not yet been fully explored. Thus, this work develops a methodology, using topology optimization techniques, for the static design of laminated piezocomposite shell structures, considering the optimization of the piezoelectric material and polarization distributions together with the optimization of the fiber angle of the composite orthotropic layers, which is free to assume different values along the same composite layer. The finite element model is based on laminated piezoelectric shell theory, using the degenerate three-dimensional solid approach and first-order shell theory kinematics, which accounts for transverse shear deformation and rotary inertia effects. The topology optimization formulation is implemented by combining the piezoelectric material with penalization and polarization model and the discrete material optimization method, where the design variables describe, respectively, the amount of piezoelectric material and the polarization sign at each finite element, and the fiber angles. Three different objective functions are formulated for the design of actuators, sensors, and energy harvesters. Results for laminated piezocomposite shell transducers are presented to illustrate the method. Copyright (C) 2012 John Wiley & Sons, Ltd.
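
As a rough sketch of the element-level material interpolation such a formulation combines, a penalization term for the amount of piezoelectric material plus a continuous polarization-sign variable, consider the following; the exponent and coefficient values are illustrative assumptions, not the paper's.

```python
# Sketch of a SIMP-like piezoelectric interpolation with a polarization
# design variable, per finite element. Values are illustrative only.
def element_piezo_coeff(x: float, p_pol: float, e0: float = 17.0,
                        penal: float = 3.0) -> float:
    """Effective piezoelectric coefficient of one element.

    x     -- density-like variable in [0, 1]: amount of piezo material
    p_pol -- polarization variable in [0, 1], mapped to a sign in [-1, +1]
    e0    -- base piezoelectric coefficient (illustrative value)
    """
    sign = 2.0 * p_pol - 1.0          # continuous polarization sign
    return (x ** penal) * sign * e0   # penalization pushes x toward 0 or 1

print(element_piezo_coeff(x=0.8, p_pol=1.0))   # material present, + polarized
print(element_piezo_coeff(x=0.8, p_pol=0.0))   # material present, - polarized
```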

Relevance: 30.00%

Abstract:

A common goal in gene expression data analysis is to identify, from a large pool of candidate genes, those that show significant changes in expression levels between a treatment and a control biological condition. Usually this is done using a test statistic and a cutoff value that separate the differentially from the nondifferentially expressed genes. In this paper, we propose a Bayesian approach to identifying differentially expressed genes by sequentially calculating credibility intervals from predictive densities, which are constructed using the sampled mean treatment effect from all genes in the study, excluding the treatment effect of genes previously identified as showing statistical evidence of difference. We compare our Bayesian approach with the standard ones based on the t-test and modified t-tests via a simulation study, using the small sample sizes that are common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially in cases with mean differences and with increases in treatment variance relative to control variance. We also apply the methodologies to a well-known publicly available data set on the Escherichia coli bacterium.
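
A minimal sketch of the sequential scheme described, flag genes falling outside a predictive interval built from the not-yet-flagged pool, rebuild the interval, and repeat, is given below; the normal predictive density and the 99.9% cutoff are illustrative assumptions, not necessarily the authors' choices.

```python
# Sketch: iteratively flag differentially expressed genes with intervals
# recomputed from the genes not yet flagged. Normal intervals are an
# assumption standing in for the paper's predictive densities.
import numpy as np

def flag_de_genes(effects: np.ndarray, z: float = 3.29) -> np.ndarray:
    """Return indices of genes flagged as differentially expressed."""
    flagged = np.zeros(len(effects), dtype=bool)
    while True:
        pool = effects[~flagged]                    # not-yet-flagged genes
        lo, hi = pool.mean() - z * pool.std(), pool.mean() + z * pool.std()
        new = ~flagged & ((effects < lo) | (effects > hi))
        if not new.any():
            return np.where(flagged)[0]
        flagged |= new                              # flag and recompute

rng = np.random.default_rng(4)
effects = np.concatenate([rng.normal(0, 1, 995),    # null genes
                          rng.normal(5, 1, 5)])     # 5 truly changed genes
print(flag_de_genes(effects))                       # should recover indices 995-999
```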