956 results for Local Galerkin method
Abstract:
Stress proteins represent a group of highly conserved intracellular proteins that provide adaptation against cellular stress. The present study aims to elucidate the stress protein-mediated effects of local hyperthermia and systemic administration of monophosphoryl lipid A (MPL) on oxygenation, metabolism and survival in bilateral porcine random-pattern buttock flaps. Preconditioning was achieved 24 h prior to surgery by applying a heating blanket to the operative site (n = 5), by intravenous administration of MPL at a dosage of 35 μg/kg body weight (n = 5), or by combining the two (n = 5). The flaps were monitored with laser Doppler flowmetry, polarographic microprobes and microdialysis until 5 h postoperatively. Semiquantitative immunohistochemistry was performed for heat shock protein 70 (HSP70), heat shock protein 32 (also termed haem oxygenase-1, HO-1) and inducible nitric oxide synthase (iNOS). The administration of MPL increased the impaired microcirculatory blood flow in the proximal part of the flap and the partial oxygen tension in the distal part by approximately 100% each (both P < 0.05), whereas both variables remained virtually unaffected by local heat preconditioning. The lactate/pyruvate (L/P) ratio and the glycerol concentration (reflecting cell membrane disintegration) in the distal part of the flap gradually increased to approximately 500 and approximately 350 μmol/l, respectively (both P < 0.01); this rise was substantially attenuated by heat application (P < 0.01 for the L/P ratio and P < 0.05 for glycerol) and by combined preconditioning (P < 0.01 for both variables), whereas the effect of MPL was less marked (not significant). Flap survival increased from 56% (untreated animals) to 65% after MPL (not significant), 71% after heat application (P < 0.05) and 78% after both methods of preconditioning (P < 0.01). iNOS and HO-1 were upregulated after each method of preconditioning (P < 0.05), whereas augmented HSP70 staining was only observed after heat application (P < 0.05). We conclude that local hyperthermia is more effective than systemic MPL administration in preventing flap necrosis because it enhances cellular tolerance to hypoxic stress, possibly mediated by HSP70, whereas some benefit may be obtained with MPL through iNOS- and HO-1-mediated improvement in tissue oxygenation.
Abstract:
In a cross-cultural study, the perceptions of local people living in the surroundings of biosphere reserves in Switzerland and Ukraine were examined using qualitative interviews. In the UNESCO Biosphere Entlebuch in Switzerland, people stated that they hoped for better regional economic development due to the existence of the biosphere reserve; at the same time, however, they feared further restrictions on land use. In the Carpathian Biosphere Reserve in Transcarpathia, Ukraine, people tended to attribute certain conditions, such as the high price of wood, directly to the existence of the biosphere reserve, when in fact these conditions and the biosphere reserve were separate, parallel developments. In both case studies, three key categories influencing local residents' perceptions and evaluations of biosphere reserves could be identified: (1) the economic situation, (2) the history of nature protection, and (3) the power balance between the involved stakeholders. Paying close attention to these three categories will help planners and managers of protected areas to better understand the reasoning of local residents for or against a biosphere reserve in their area.
Abstract:
The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos tectonic plate under the Caribbean plate. Located 30 km south of Guatemala City, Pacaya sits on the southern rim of the Amatitlan Caldera. It is the largest post-caldera volcano and has been one of Central America's most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., the Pacaya volcano experienced a huge collapse, which resulted in the formation of a horseshoe-shaped scarp that is still visible. In recent years, several smaller collapses (in 1961 and 2010) have been associated with the activity of the volcano, affecting its north-western flanks; these were likely induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents suggest a reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs for the geological and geotechnical models were defined based on stratigraphical, lithological and structural data and on material properties obtained from field surveys and laboratory tests. According to their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia and Breccia-Lava. The Hoek-Brown failure criterion was applied to each lithotechnical unit, and the rock-mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. Furthermore, the stability of the volcano was evaluated by two-dimensional analyses performed with the limit equilibrium method (LEM, ROCSCIENCE) and the finite element method (FEM, PHASE 2 7.0). The stability analysis mainly focused on the modern Pacaya volcano built inside the collapse amphitheatre of "Old Pacaya". The volcanic instability was assessed based on the variability of the safety factor using deterministic, sensitivity and probabilistic analyses, considering gravitational instability and the effects of external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results provide two insights: first, the least stable sector is the south-western flank of the volcano; second, the lowest safety factor value suggests that the edifice is stable under gravity alone, and that external triggering mechanisms represent a likely destabilizing factor.
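For reference, the generalized Hoek-Brown criterion applied to each lithotechnical unit relates the principal stresses at failure as

    \sigma_1' = \sigma_3' + \sigma_{ci}\left( m_b\,\frac{\sigma_3'}{\sigma_{ci}} + s \right)^{a}

where \sigma_{ci} is the uniaxial compressive strength of the intact rock and m_b, s and a are rock-mass constants, conventionally derived from the Geological Strength Index (GSI) and the disturbance factor D; the parameter values fitted for the Pacaya units are not reproduced here.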
Abstract:
In power-electronic-based microgrids, the computational requirements needed to implement an optimized online control strategy can be prohibitive. The work presented in this dissertation proposes a generalized method for deriving geometric manifolds in a dc microgrid, based on the a priori computation of the optimal reactions and trajectories for classes of events in the microgrid. The proposed states are the stored energies in all the energy storage elements of the dc microgrid and the power flowing into them. It is anticipated that calculating a large enough set of dissimilar transient scenarios will also span many scenarios not specifically used to develop the surface. These geometric manifolds are then used as reference surfaces in any type of controller, such as a sliding-mode hysteretic controller. The switched power converters in microgrids require different control actions for different system events, and control of the converters' switch states is essential for steady-state and transient operation. A digital-memory look-up based controller that uses a hysteretic sliding-mode control strategy is an effective technique to generate the proper switch states for the converters. An example dc microgrid with three dc-dc boost converters and resistive loads is considered in this work. The geometric manifolds are successfully generated for transient events such as step changes in the loads and the sources. The surfaces corresponding to a specific case of step change in the loads are then used as reference surfaces in an EEPROM to experimentally validate the control strategy. The required switch states corresponding to this specific transient scenario are programmed into the EEPROM as a memory table. This controls the switching of the dc-dc boost converters and drives the system states to the reference manifold. In this work, it is shown that this strategy effectively controls the system for transient conditions such as step changes in the loads for the example case.
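As a minimal sketch of the memory look-up idea, in Python and with all resolutions and ranges assumed for illustration (this is not the dissertation's implementation): the reference manifold is quantized into a table of switch commands, and a hysteresis band around the surface suppresses chattering.

    import numpy as np

    # EEPROM-style memory table: switch command (0/1) indexed by the
    # quantized state (stored energy, port power), filled offline from
    # the precomputed optimal trajectories.
    N_E, N_P = 64, 64                              # table resolution (assumed)
    E_RANGE, P_RANGE = (0.0, 2.0), (-50.0, 50.0)   # J and W (assumed)
    table = np.zeros((N_E, N_P), dtype=np.uint8)

    def quantize(x, lo, hi, n):
        """Map a measurement onto a table index, clamping at the edges."""
        return min(max(int((x - lo) / (hi - lo) * n), 0), n - 1)

    def switch_command(energy, power, sigma, prev, band=0.05):
        """Hysteretic sliding-mode law: hold the previous switch state while
        the distance sigma from the reference manifold is inside the band,
        otherwise take the stored command that drives the state back."""
        if abs(sigma) <= band:
            return prev                  # no switching inside the band
        return table[quantize(energy, *E_RANGE, N_E),
                     quantize(power, *P_RANGE, N_P)]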
Abstract:
OBJECTIVE: In ictal scalp electroencephalography (EEG), the presence of artefacts and the wide-ranging patterns of discharges are hurdles to good diagnostic accuracy. Quantitative EEG aids the lateralization and/or localization of epileptiform activity. METHODS: Twelve patients achieving Engel Class I/IIa outcome following temporal lobe surgery (at 1 year) were selected, with approximately 1-3 ictal EEGs analyzed per patient. The EEG signals were denoised with the discrete wavelet transform (DWT), followed by computation of the normalized absolute slopes and spatial interpolation of the scalp topography with detection of local maxima. For localization, the region with the highest normalized absolute slopes at the time when epileptiform activities were registered (>2.5 times the standard deviation) was designated the region of onset. For lateralization, the cerebral hemisphere registering the first appearance of normalized absolute slopes >2.5 times the standard deviation was designated the side of onset. For comparison, all EEG episodes were reviewed by two neurologists blinded to clinical information to determine the localization and lateralization of seizure onset by visual analysis. RESULTS: 16/25 seizures (64%) were correctly localized by the visual method and 21/25 seizures (84%) by the quantitative EEG method. 12/25 seizures (48%) were correctly lateralized by the visual method and 23/25 seizures (92%) by the quantitative EEG method. The McNemar test gave p = 0.15 for localization and p = 0.0026 for lateralization when comparing the two methods. CONCLUSIONS: The quantitative EEG method correctly lateralized significantly more seizure episodes, with a trend towards more correctly localized seizures. SIGNIFICANCE: Coupling the DWT with the absolute slope method helps clinicians achieve better EEG diagnostic accuracy.
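A minimal sketch of the two signal-processing steps in Python, assuming PyWavelets, a db4 wavelet and soft universal thresholding (the paper's exact wavelet, decomposition level and thresholding rule may differ):

    import numpy as np
    import pywt  # PyWavelets

    def denoise_dwt(sig, wavelet="db4", level=4):
        """DWT denoising: soft-threshold the detail coefficients with the
        universal threshold and reconstruct the signal."""
        coeffs = pywt.wavedec(sig, wavelet, level=level)
        noise_sd = np.median(np.abs(coeffs[-1])) / 0.6745   # MAD estimate
        thr = noise_sd * np.sqrt(2 * np.log(len(sig)))
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(sig)]

    def normalized_abs_slope(sig, fs, baseline):
        """Absolute first derivative, normalized by the standard deviation
        of the same quantity over a baseline (pre-ictal) segment."""
        slope = np.abs(np.diff(sig)) * fs
        return slope / np.std(np.abs(np.diff(baseline)) * fs)

    # Channels whose normalized absolute slope exceeds 2.5 would mark the
    # putative region/side of onset.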
Abstract:
In his compelling case study of local governance and community safety in the UK Thames Valley, Kevin Stenson makes several important contributions to the field of governmentality studies. While the paper's merits are far-reaching, to this reader's assessment they can be summarized in three key areas. (1) Empirically, the article enhances our knowledge of the political-economic transformation of a region otherwise overlooked in social science research. (2) Conceptually, Stenson offers several theoretical and analytical refrains that, while becoming increasingly commonplace, are nonetheless still germane and rightly oriented to push back against otherwise totalizing, reified accounts of roll-back/roll-out neoliberalism; a welcome new approach, the realist governmentality perspective, is offered as a corrective, emphasizing the interrelated and co-constitutive nature of politics, local culture, and habitus in processes related to the restructuring of social governance. (3) Methodologically, the paper makes a pitch for the ways in which finely grained, nuanced, mixed-method/ethnographic analyses can further problematize and recast a field of governmentality studies far too often dominated by discursive and textual approaches.
Abstract:
Background Finite element models of augmented vertebral bodies require a realistic modelling of the cement-infiltrated region. Most methods published so far used idealized cement shapes or oversimplified material models for the augmented region. In this study, an improved, anatomy-specific, homogenized finite element method was developed and validated to predict the apparent as well as the local mechanical behavior of augmented vertebral bodies. Methods Forty-nine human vertebral body sections were prepared by removing the cortical endplates and scanned with high-resolution peripheral quantitative CT before and after injection of a standard and a low-modulus bone cement. Forty-one specimens were tested in compression to measure stiffness, strength and contact pressure distributions between specimens and loading plates. From the remaining eight specimens, fourteen cylindrical cores were extracted from the augmented region and tested in compression to obtain material properties. Anatomy-specific finite element models were generated from the CT data. The models featured element-specific, density-fabric-based material properties, damage accumulation, real cement distributions and experimentally determined material properties for the augmented region. Apparent stiffness and strength as well as contact pressure distributions at the loading plates were compared between simulations and experiments. Findings The finite element models were able to predict apparent stiffness (R² > 0.86) and apparent strength (R² > 0.92) very well. Also, the numerically obtained pressure distributions were in reasonable quantitative (R² > 0.48) and qualitative agreement with the experiments. Interpretation The proposed finite element models have proven to be an accurate tool for studying the apparent as well as the local mechanical behavior of augmented vertebral bodies.
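The density-fabric-based constitutive law itself is not reproduced in the abstract; as a simpler illustration of the density dependence such models build on, trabecular bone modulus is commonly related to bone density by a power law of the form

    E = E_0 \, \rho^{k}

with an exponent k on the order of 2; fabric-based models additionally modulate the stiffness tensor along the principal trabecular directions.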
Abstract:
Images of an object under different illumination are known to provide strong cues about the object surface. A mathematical formalization of how to recover the normal map of such a surface leads to the so-called uncalibrated photometric stereo problem. In the simplest instance, this problem can be reduced to the task of identifying only three parameters: the so-called generalized bas-relief (GBR) ambiguity. The challenge is to find additional general assumptions about the object that identify these parameters uniquely. Current approaches are not consistent, i.e., they provide different solutions when run multiple times on the same data. To address this limitation, we propose exploiting local diffuse reflectance (LDR) maxima, i.e., points in the scene where the normal vector is parallel to the illumination direction (see Fig. 1). We demonstrate several noteworthy properties of these maxima: a closed-form solution, computational efficiency and GBR consistency. An LDR maximum yields a simple closed-form solution corresponding to a semi-circle in the GBR parameter space (see Fig. 2); because as few as two diffuse maxima in different images identify a unique solution, the GBR parameters can be identified very efficiently; finally, the algorithm is consistent, as it always returns the same solution given the same data. Our algorithm is also remarkably robust: it can obtain an accurate estimate of the GBR parameters even with extremely high levels of outliers in the detected maxima (up to 80% of the observations). The method is validated on real data and achieves state-of-the-art results.
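For reference, the GBR ambiguity can be written explicitly: with pseudo-normals b (unit normal scaled by albedo) and light vectors s, all Lambertian image values b^\top s are unchanged under

    b \mapsto G^{-\top} b, \qquad s \mapsto G s, \qquad
    G = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ \mu & \nu & \lambda \end{pmatrix}, \quad \lambda \neq 0,

so resolving the ambiguity amounts to identifying the triple (\mu, \nu, \lambda); each LDR maximum, where b is parallel to s, constrains this triple to the semi-circle mentioned above.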
Abstract:
Over the last decade, a plethora of computer-aided diagnosis (CAD) systems have been proposed to improve the accuracy of physicians in the diagnosis of interstitial lung diseases (ILD). In this study, we propose a scheme for the classification of HRCT image patches with ILD abnormalities as a basic component towards the quantification of the various ILD patterns in the lung. The feature extraction method relies on local spectral analysis using a DCT-based filter bank. After convolving the image with the filter bank, q-quantiles are computed to describe the distribution of the local frequencies that characterize image texture. The gray-level histogram values of the original image are then appended to form the final feature vector. The described patches are classified by a random forest (RF) classifier. The experimental results demonstrate the superior performance and efficiency of the proposed approach compared with the state of the art.
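A minimal sketch of the feature pipeline in Python, with filter size, quantile levels and histogram bins chosen for illustration (the paper's exact settings may differ):

    import numpy as np
    from scipy.ndimage import convolve
    from sklearn.ensemble import RandomForestClassifier

    def dct_filter_bank(k=3):
        """k*k separable 2-D filters built from the DCT-II basis."""
        n = np.arange(k)
        basis = [np.cos(np.pi * (n + 0.5) * u / k) for u in range(k)]
        return [np.outer(bu, bv) for bu in basis for bv in basis]

    def patch_features(patch, q=(0.1, 0.25, 0.5, 0.75, 0.9), bins=16):
        """q-quantiles of each filter response plus the gray-level
        histogram (patch intensities assumed scaled to [0, 1])."""
        feats = [np.quantile(np.abs(convolve(patch, f)), q)
                 for f in dct_filter_bank()]
        hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0),
                               density=True)
        return np.concatenate(feats + [hist])

    # X = np.array([patch_features(p) for p in patches])
    # clf = RandomForestClassifier(n_estimators=200).fit(X, labels)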
Abstract:
BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, like a social security number, is not available, or non-unique person-identifiable information, like names, is privacy-protected and cannot be accessed. A solution that protects privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely even if the plain names differ only by a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS The P3RL method applies a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed to create these templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enable the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with the encrypted person-identifiable information and the plain non-sensitive variables. RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but for any setting with similar challenges.
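A minimal sketch of the Bloom-filter encoding idea in Python (illustrative parameters; the P3RL templates, data masking and date procedures are not shown): character bigrams of a name are hashed into a bit set with keyed hashes, and the Dice coefficient on two encrypted filters approximates the similarity of the plain names.

    import hashlib, hmac

    def bloom_encode(name, key, m=1000, k=20):
        """Hash the character bigrams of a padded name into an m-bit Bloom
        filter using k keyed hash functions (HMAC-SHA256, one salt per
        function)."""
        padded = f"_{name.lower()}_"
        bigrams = {padded[i:i + 2] for i in range(len(padded) - 1)}
        bits = set()
        for g in bigrams:
            for i in range(k):
                h = hmac.new(key, f"{i}{g}".encode(), hashlib.sha256)
                bits.add(int(h.hexdigest(), 16) % m)
        return bits

    def dice(a, b):
        """Dice similarity of two encoded names, tolerant of typos."""
        return 2 * len(a & b) / (len(a) + len(b))

    # dice(bloom_encode("meier", b"secret"), bloom_encode("meyer", b"secret"))
    # stays high although the plain names differ by a single character.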
Abstract:
The long-term integrity of protected areas (PAs), and hence the maintenance of related ecosystem services (ES), depend on the support of local people. In the present study, local people's perceptions of ecosystem services from PAs, and the factors that govern local preferences for PAs, are assessed. Fourteen study villages were randomly selected from three different protected forest areas and one control site along the southern coast of Côte d'Ivoire. Data were collected through a mixed-method approach, including qualitative semi-structured interviews and a household survey based on hypothetical choice scenarios. Local people's perceptions of ecosystem service provision were examined through qualitative content analysis, while the relation between people's preferences and the potential factors affecting them was analyzed with multinomial models. This study shows that rural villagers do perceive a number of different ecosystem services as benefits from PAs in Côte d'Ivoire. The results based on the quantitative data also suggest that local preferences for PAs and related ecosystem services are driven by the PAs' management rules, age, and people's dependence on natural resources.
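The multinomial specification is not detailed in the abstract; in the common multinomial logit formulation for such choice scenarios, the probability that respondent n chooses alternative i from choice set J is

    P_{ni} = \frac{\exp(V_{ni})}{\sum_{j \in J} \exp(V_{nj})}

where V_{ni} is a linear index of alternative attributes and respondent characteristics (here, e.g., management rules, age and resource dependence).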
Abstract:
The focal point of this paper is to propose and analyze a P0 discontinuous Galerkin (DG) formulation for image denoising. The scheme is based on a total variation approach that has been applied successfully in previous papers on image processing. The main idea of the new scheme is to model the restoration process as a discrete energy minimization problem and to derive a corresponding DG variational formulation. Furthermore, we prove that the method admits a unique solution and that a natural maximum principle holds. In addition, a number of examples illustrate the effectiveness of the method.
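The discrete energy itself is not reproduced in the abstract; the underlying total variation (Rudin-Osher-Fatemi) model that such schemes discretize minimizes

    \min_{u} \int_\Omega |\nabla u| \, dx + \frac{\lambda}{2} \int_\Omega (u - f)^2 \, dx

where f is the noisy image, u the restored image and \lambda weights data fidelity against smoothing; the P0 DG scheme replaces this functional by a piecewise-constant discrete counterpart.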
Abstract:
We present a novel surrogate-model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as the global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and that it even provides significant advantages when compared with state-of-the-art EI algorithms.
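A minimal sketch of the standard EI criterion used within each local model, in Python; mu and sigma stand for the local GP posterior mean and standard deviation at a candidate point:

    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, f_min):
        """Standard EI for minimization: the expected amount by which a
        point with GP posterior N(mu, sigma^2) improves on the incumbent
        best observed value f_min."""
        sigma = np.maximum(sigma, 1e-12)      # guard degenerate predictions
        z = (f_min - mu) / sigma
        return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # The next evaluation maximizes expected_improvement over the region
    # selected by the global sparse GP model.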
Abstract:
Historically, morphological features were used as the primary means to classify organisms. However, the age of molecular genetics has allowed us to approach this field from the perspective of the organism's genetic code. Early work used highly conserved sequences, such as ribosomal RNA. The increasing number of complete genomes in the public data repositories provides the opportunity to look not only at a single gene, but at organisms' entire parts lists. Here the Sequence Comparison Index (SCI) and the Organism Comparison Index (OCI), algorithms and methods to compare proteins and proteomes, are presented. The complete proteomes of 104 sequenced organisms were compared. Over 280 million full Smith-Waterman alignments were performed on sequence pairs that had a reasonable expectation of being related. From these alignments a whole-proteome phylogenetic tree was constructed. This method was also used to compare the small subunit (SSU) rRNA from each organism, and a tree was constructed from these results. The SSU rRNA tree built by the SCI/OCI method looks very much like accepted SSU rRNA trees from sources such as the Ribosomal Database Project, thus validating the method. The SCI/OCI proteome tree showed a number of small but significant differences when compared to the SSU rRNA tree and to proteome trees constructed by other methods. Horizontal gene transfer does not appear to affect the SCI/OCI trees until the transferred genes make up a large portion of the proteome. As part of this work, the Database of Related Local Alignments (DaRLA) was created; it contains over 81 million rows of sequence alignment information. DaRLA, while primarily used to build the whole-proteome trees, can also be applied to shared gene content analysis, gene order analysis, and the creation of individual protein trees. Finally, the standard BLAST method for analyzing shared gene content was compared to the SCI method using four spirochetes. The SCI system performed flawlessly, finding all proteins from one organism against itself and all the ribosomal proteins between organisms. The BLAST system missed some proteins from its respective organism and failed to detect small ribosomal proteins between organisms.
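A minimal sketch of the underlying pairwise step in Python, using Biopython's local aligner as a stand-in for the dissertation's Smith-Waterman pipeline (scoring parameters are illustrative, and the SCI normalization itself is not shown):

    from Bio.Align import PairwiseAligner, substitution_matrices

    # Smith-Waterman-style local protein alignment: BLOSUM62 scores with
    # affine gap penalties.
    aligner = PairwiseAligner()
    aligner.mode = "local"
    aligner.substitution_matrix = substitution_matrices.load("BLOSUM62")
    aligner.open_gap_score = -10.0
    aligner.extend_gap_score = -0.5

    def local_score(seq_a, seq_b):
        """Raw local alignment score between two protein sequences."""
        return aligner.score(seq_a, seq_b)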
Abstract:
The Houston region is home to arguably the largest petrochemical and refining complex anywhere. The effluent of this complex includes many potentially hazardous compounds, and study of some of them has led to the recognition that a number of known and probable carcinogens are at elevated levels in ambient air. Two of these, benzene and 1,3-butadiene, have been found at concentrations that may pose a health risk for residents of Houston. Recent popular journalism and publications by local research institutions have increased public interest in Houston's air quality. Much of this literature has been critical of local regulatory agencies' oversight of industrial pollution, and a number of citizens in the region have begun to volunteer with air-quality advocacy groups in the testing of community air. Inexpensive methods exist for monitoring ambient concentrations of ozone, particulate matter and airborne toxics. This study is an evaluation of a technique that has been successfully applied to airborne toxics. This technique, solid-phase microextraction (SPME), has been used to measure airborne volatile organic hydrocarbons at community-level concentrations. It has yielded accurate and rapid concentration estimates at a relatively low cost per sample. Examples of its application to the measurement of airborne benzene exist in the literature; none have been found for airborne 1,3-butadiene. These compounds were selected to evaluate SPME as a community-deployed technique, to replicate previous applications to benzene, to expand the application to 1,3-butadiene, and because of the salience of these compounds in this community. This study demonstrates that SPME is a useful technique for quantification of 1,3-butadiene at concentrations observed in Houston; laboratory background levels precluded recommendation of the technique for benzene. One type of SPME fiber, 85 μm Carboxen/PDMS, was found to be a sensitive sampling device for 1,3-butadiene under temperature and humidity conditions common in Houston. The study indicates that these variables affect instrument response, which suggests the necessity of calibration under the specific conditions of interest. While deployment of this technique was less expensive than other methods of quantifying 1,3-butadiene, the complexity of calibration may exclude an SPME method from broad deployment by community groups.