955 results for Gordon, Matthew: Sociolinguistics : method and interpretation
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Research on image processing has shown that combining segmentation methods may lead to a solid approach for extracting semantic information from different sorts of images. Within this context, the Normalized Cut (NCut) is commonly used as a final partitioning tool for graphs built with some chosen modeling method. This work explores the Watershed Transform as a modeling tool, using different criteria of the hierarchical Watershed to convert an image into an adjacency graph. The Watershed is combined with an unsupervised distance-learning step that redistributes the graph weights and redefines the similarity matrix before the final segmentation step using NCut. Using the Berkeley Segmentation Data Set and Benchmark as a reference, our goal is to compare the results obtained with this method against previous work to validate its performance.
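The final NCut partitioning step mentioned above can be sketched for a small adjacency graph. This is the generic spectral relaxation of the two-way Normalized Cut (threshold the second eigenvector of the normalized Laplacian), not the authors' Watershed-based pipeline; the toy weight matrix is an illustrative assumption:

```python
import numpy as np

def ncut_bipartition(W):
    """Bipartition a weighted adjacency matrix W with the Normalized Cut.

    Relaxed NCut: take the eigenvector of the second-smallest eigenvalue
    of the normalized Laplacian D^{-1/2} (D - W) D^{-1/2}, map it back
    through D^{-1/2}, and threshold at zero to get the two-way partition.
    """
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    L = np.diag(d) - W
    L_sym = d_inv_sqrt[:, None] * L * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(L_sym)          # eigenvalues in ascending order
    fiedler = d_inv_sqrt * eigvecs[:, 1]        # second-smallest eigenvector
    return fiedler >= 0

# Two dense clusters of three nodes joined by a single weak edge.
W = np.array([
    [0.0, 5.0, 5.0, 0.1, 0.0, 0.0],
    [5.0, 0.0, 5.0, 0.0, 0.0, 0.0],
    [5.0, 5.0, 0.0, 0.0, 0.0, 0.0],
    [0.1, 0.0, 0.0, 0.0, 5.0, 5.0],
    [0.0, 0.0, 0.0, 5.0, 0.0, 5.0],
    [0.0, 0.0, 0.0, 5.0, 5.0, 0.0],
])
labels = ncut_bipartition(W)  # cuts the weak edge between the two clusters
```

In the pipeline described above, W would be the Watershed-derived similarity matrix rather than a hand-written toy graph.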
Abstract:
Aim: To use published literature and experts' opinion to investigate the clinical meaning and magnitude of changes in the Quality of Life (QOL) of groups of patients measured with the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire Core 30 (EORTC QLQ-C30). Methods: An innovative method combining systematic review of published studies, expert opinions and meta-analysis was used to estimate large, medium, and small mean changes over time for QLQ-C30 scores. Results: Nine hundred and eleven papers were identified, leading to 118 relevant papers. One thousand two hundred and thirty-two mean changes in QOL over time were combined in the meta-analysis, with timescales ranging from four days to five years. Guidelines were produced for trivial, small, and medium size classes, for each subscale and for improving and declining scores separately. Estimates for improvements were smaller than respective estimates for declines. Conclusions: These guidelines can be used to aid sample size calculations and interpretation of mean changes over time from groups of patients. Observed mean changes in the QLQ-C30 scores are generally small in most clinical situations, possibly due to response shift. Careful consideration is needed when planning studies where QOL changes over time are of primary interest; the timing of follow up, sample attrition, direction of QOL changes, and subscales of primary interest are key considerations. (C) 2012 Elsevier Ltd. All rights reserved.
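The guideline idea of bucketing an observed mean score change into size classes can be illustrated with a minimal sketch. The numeric cut-offs below are illustrative assumptions only; the paper derives separate thresholds for each subscale and for improving versus declining scores:

```python
def classify_mean_change(delta):
    """Bucket an absolute mean QLQ-C30 score change (0-100 scale) into a
    size class. The cut-offs (5 and 10 points) are hypothetical values
    for illustration, not the guideline thresholds derived in the paper."""
    d = abs(delta)
    if d < 5:
        return "trivial"
    if d < 10:
        return "small"
    return "medium"

# Hypothetical observed mean changes, one per study arm.
sizes = [classify_mean_change(d) for d in (3, -7, 12)]
# → ['trivial', 'small', 'medium']
```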
Abstract:
The frequency distribution of SNPs and haplotypes in the ABCB1, SLCO1B1 and SLCO1B3 genes varies widely among continental populations. This variation can lead to biases in pharmacogenetic studies conducted in admixed populations such as those from Brazil and other Latin American countries. The aim of this study was to evaluate the influence of self-reported colour, geographical origin and genomic ancestry on distributions of the ABCB1, SLCO1B1 and SLCO1B3 polymorphisms and derived haplotypes in admixed Brazilian populations. A total of 1039 healthy adults from the north, north-east, south-east and south of Brazil were recruited for this investigation. The c.388A>G (rs2306283), c.463C>A (rs11045819) and c.521T>C (rs4149056) SNPs in the SLCO1B1 gene and c.334T>G (rs4149117) and c.699G>A (rs7311358) SNPs in the SLCO1B3 gene were determined by Taqman 5'-nuclease assays. The ABCB1 c.1236C>T (rs1128503), c.2677G>T/A (rs2032582) and c.3435C>T (rs1045642) polymorphisms were genotyped using a previously described single-base extension/termination method. The results showed that genotype and haplotype distributions are highly variable among populations of the same self-reported colour and geographical region. However, genomic ancestry showed that these associations are better explained by a continuous variable. The influence of ancestry on the distribution of alleles and haplotype frequencies was more evident in variants with large differences in allele frequencies between European and African populations. Design and interpretation of pharmacogenetic studies using these transporter genes should include genomic controls to avoid spurious conclusions based on improper matching of study cohorts from Brazilian populations and other highly admixed populations.
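As a minimal illustration of the frequency estimates underlying such a study, allele frequencies at a biallelic SNP can be computed from diploid genotype calls. The genotype sample below is hypothetical, not data from the study:

```python
from collections import Counter

def allele_frequencies(genotypes):
    """Estimate allele frequencies from a list of diploid genotypes.

    Each genotype is a 2-tuple of alleles, e.g. ('C', 'T') for a
    heterozygote at ABCB1 c.3435C>T.
    """
    counts = Counter(allele for genotype in genotypes for allele in genotype)
    total = sum(counts.values())  # 2N chromosomes for N individuals
    return {allele: n / total for allele, n in counts.items()}

# Hypothetical sample of five individuals at c.3435C>T.
sample = [('C', 'C'), ('C', 'T'), ('C', 'T'), ('T', 'T'), ('C', 'C')]
freqs = allele_frequencies(sample)  # → {'C': 0.6, 'T': 0.4}
```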
Abstract:
In this paper we review the novel meccano method. We summarize the main stages (subdivision, mapping, optimization) of this automatic tetrahedral mesh generation technique and concentrate the study on complex genus-zero solids. In this case, our procedure requires only a surface triangulation of the solid. A crucial consequence of our method is the volume parametrization of the solid to a cube. Using this result, we construct volume T-meshes for isogeometric analysis. The efficiency of the proposed technique is shown with several examples. A comparison between the meccano method and standard mesh generation techniques is presented.
Abstract:
This thesis presents new methods to simulate systems with hydrodynamic and electrostatic interactions. Part 1 is devoted to computer simulations of Brownian particles with hydrodynamic interactions. The main influence of the solvent on the dynamics of Brownian particles is that it mediates hydrodynamic interactions. In the method, this is simulated by numerical solution of the Navier-Stokes equation on a lattice. To this end, the Lattice-Boltzmann method is used, namely its D3Q19 version. This model is capable of simulating compressible flow, which gives us the advantage of treating dense systems, in particular away from thermal equilibrium. The Lattice-Boltzmann equation is coupled to the particles via a friction force. In addition to this force, acting on point particles, we construct another coupling force, which comes from the pressure tensor. The coupling is purely local, i.e. the algorithm scales linearly with the total number of particles. In order to be able to map the physical properties of the Lattice-Boltzmann fluid onto a Molecular Dynamics (MD) fluid, the case of an almost incompressible flow is considered. The Fluctuation-Dissipation theorem for the hybrid coupling is analyzed, and a geometric interpretation of the friction coefficient in terms of a Stokes radius is given. Part 2 is devoted to the simulation of charged particles. We present a novel method for obtaining Coulomb interactions as the potential of mean force between charges which are dynamically coupled to a local electromagnetic field. This algorithm scales linearly, too. We focus on the Molecular Dynamics version of the method and show that it is intimately related to the Car-Parrinello approach, while being equivalent to solving Maxwell's equations with freely adjustable speed of light. The Lagrangian formulation of the coupled particle-field system is derived. The quasi-Hamiltonian dynamics of the system is studied in great detail.
For implementation on the computer, the equations of motion are discretized with respect to both space and time. The discretization of the electromagnetic fields on the lattice, as well as the interpolation of the particle charges onto the lattice, is described. The algorithm is as local as possible: only nearest-neighbor sites of the lattice interact with a charged particle. Unphysical self-energies arise as a result of the lattice interpolation of charges and are corrected by a subtraction scheme based on the exact lattice Green's function. The method allows easy parallelization using standard domain decomposition. Some benchmarking results of the algorithm are presented and discussed.
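The friction coupling of Part 1 and its Stokes-radius interpretation can be sketched as follows. A force of the form F = -γ(v - u) is a common way to realize such a point-particle coupling; the lattice correction constant g and all numeric values below are assumptions for illustration, not the thesis's actual parameters:

```python
import numpy as np

def friction_force(v_particle, u_fluid, gamma):
    """Point-particle coupling force F = -gamma (v - u), where u is the
    fluid velocity interpolated to the particle position. Momentum -F
    would be returned to the surrounding lattice nodes so that the
    coupled system conserves total momentum."""
    return -gamma * (np.asarray(v_particle) - np.asarray(u_fluid))

def effective_stokes_radius(gamma, eta, g=25.0):
    """Effective hydrodynamic radius implied by a bare friction gamma.

    Assumes a relation of the form 1/gamma_eff = 1/gamma + 1/(g * eta * a)
    with lattice spacing a = 1, where g depends on the interpolation
    scheme (the value 25.0 here is a placeholder). The Stokes radius R
    then follows from gamma_eff = 6 * pi * eta * R."""
    gamma_eff = 1.0 / (1.0 / gamma + 1.0 / (g * eta))
    return gamma_eff / (6.0 * np.pi * eta)

F = friction_force([1.0, 0.0, 0.0], [0.4, 0.0, 0.0], gamma=2.0)  # → [-1.2, 0, 0]
R = effective_stokes_radius(gamma=10.0, eta=1.0)
```

Note that gamma_eff is always smaller than the bare gamma, so the effective Stokes radius saturates as the bare friction is increased.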
Abstract:
The "sustainability" concept relates to prolonging human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources in a manner that allows growth and development to be sustained for the long term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe as well as in the U.S. are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and emissions, which in turn improves conditions for workers and supports sustainable development. The crumb-rubber modifier (CRM), made from shredded automobile tires and used in the United States since the mid-1980s, has also proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only in an environmental respect but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance and to suggest a low-environmental-impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design, but it cannot be the only step.
The eco-compatible approach should also be extended to the design method and material characterization, because only with these phases is it possible to exploit the maximum potential of the materials used. Appropriate asphalt concrete characterization is essential for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under different traffic and environmental conditions, was the application of choice. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to surface cracking and rutting, respectively. It works in increments of time and, recursively using the output from one increment as input to the next, predicts the pavement condition in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation under defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as a surface layer of 60 mm thickness. The performance of the pavement was compared to that of the same pavement structure with different kinds of asphalt concrete as the surface layer. In comparison to a conventional asphalt concrete, three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed.
The first two chapters summarize the steps needed to satisfy the sustainable pavement design procedure. In Chapter I the problem of eco-compatible asphalt pavement design is introduced. The low-environmental-impact materials, such as Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail. In addition, the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate materials selection and performance evaluation. In Chapter III, CalME is introduced through a specific explanation of the different design approaches provided, and in particular the I-R procedure. In Chapter IV, the experimental program is presented with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI, respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the asphalt pavement structure simulations with different surface layers are reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rut depth in each bound layer were analyzed.
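The incremental-recursive idea described above (the output of one time increment fed recursively into the next) can be sketched in miniature. The parameter names and the linear damage law below are illustrative assumptions, not CalME's actual fatigue or shear-strain models:

```python
def incremental_recursive(n_increments, E0, load_per_increment, damage_rate):
    """Toy incremental-recursive loop in the spirit of an I-R procedure.

    Each increment computes a response (here, a strain) with the current
    layer modulus, accumulates damage from that response, and feeds the
    damaged modulus recursively into the next increment."""
    E, damage, history = E0, 0.0, []
    for _ in range(n_increments):
        strain = load_per_increment / E              # response with current modulus
        damage = min(1.0, damage + damage_rate * strain)
        E = E0 * (1.0 - damage)                      # degraded modulus for next step
        history.append((E, damage))
    return history

# Hypothetical numbers: a 3000 MPa layer softening over 12 increments.
history = incremental_recursive(12, E0=3000.0,
                                load_per_increment=600.0, damage_rate=0.05)
```

Because damage accumulates, the modulus decreases monotonically and each increment sees a slightly larger strain than the last, which is the essential feedback the recursive formulation captures.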
Abstract:
Extended X-ray absorption fine structure (EXAFS) spectroscopy is an important method for the speciation of heavy metals in a wide range of environmentally relevant systems. To determine structural parameters such as coordination number, interatomic distance and Debye-Waller factors for the nearest neighbors of an absorbing atom, experimental EXAFS spectra are usually subjected to a least-squares fit using model structures. Often, several model structures with entirely different chemical meanings can describe the experimental EXAFS data equally well. The modified Tikhonov regularization method offers a good alternative to the conventional curve fit. In addition to the standard Tikhonov variational method, the algorithm presented in this work contains two further steps, namely the application of the method of separating functionals and an iteration procedure with filtration in real space. To test and validate the modified Tikhonov regularization method, both simulated and experimentally measured EXAFS spectra of a crystalline U(VI) compound of known structure, namely soddyite (UO2)2SiO4 · 2H2O, were examined. The capability of this new method for evaluating EXAFS spectra is demonstrated by applying it to the analysis of samples of unknown structure, such as those arising from the sorption of U(VI) and of Pu(III)/Pu(IV) onto kaolinite. The aim of this dissertation was to demonstrate the still not fully exploited potential of the modified Tikhonov regularization method for the evaluation of EXAFS spectra. The results fall into two categories.
The first comprises the development of the Tikhonov regularization method for the analysis of EXAFS spectra of multi-component systems, in particular the choice of specific regularization parameters and the influence of multiple scattering, experimental noise, etc. on the structural parameters. The second part comprises the speciation of U(VI) and Pu(III)/Pu(IV) sorbed onto kaolinite, based on experimental EXAFS spectra that were evaluated with the modified Tikhonov regularization method and confirmed by conventional EXAFS analysis using least-squares fitting.
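The core of standard Tikhonov regularization, which this work extends, can be sketched on a generic ill-conditioned least-squares problem. The matrix and parameter values below are illustrative assumptions, not an actual EXAFS integral kernel:

```python
import numpy as np

def tikhonov_solve(A, y, alpha):
    """Tikhonov-regularized least squares: min ||A x - y||^2 + alpha ||x||^2.

    Closed form: x = (A^T A + alpha I)^{-1} A^T y. In EXAFS analysis, A
    would be the kernel relating the radial distribution function to the
    measured spectrum; here it is a generic ill-conditioned matrix."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

# Toy ill-posed problem: a Vandermonde matrix has nearly collinear columns,
# so the unregularized normal equations are numerically unstable.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 20), 8, increasing=True)
x_true = np.ones(8)
y = A @ x_true + 1e-4 * rng.standard_normal(20)  # noisy measurements
x_reg = tikhonov_solve(A, y, alpha=1e-6)
```

The regularization parameter alpha trades fidelity to the data against stability of the solution; choosing it well is exactly the kind of question the modified method above addresses.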
Abstract:
The European External Action Service (EEAS or Service) is one of the most significant and most debated innovations introduced by the Lisbon Treaty. This analysis intends to explain the anomalous design of the EEAS in light of its function, which consists in the promotion of external action coherence. Coherence is a principle of the EU legal system, which requires synergy in the external actions of the Union and its Members. It can be enforced only through the coordination of European policy-makers' initiatives, by bridging the gap between the 'Communitarian' and intergovernmental approaches. This is the 'Union method' envisaged by A. Merkel: "coordinated action in a spirit of solidarity - each of us in the area for which we are responsible but all working towards the same goal". The EEAS embodies the 'Union method', since it is institutionally linked to both Union organs and Member States. It is also capable of enhancing synergy in policy management and promoting unity in international representation, since its field of action is delimited not by an abstract concern for institutional balance but by a pragmatic assessment of the need for coordination in each sector. The challenge is now to make sure that this pragmatic approach is applied with respect to all the activities of the Service, in order to reinforce its effectiveness. The coordination brought by the EEAS is in fact the only means through which a European foreign policy can come into being: the choice is not between the Community method and the intergovernmental method, but between a coordinated position and nothing at all.
Abstract:
The multi-target screening method described in this work allows the simultaneous detection and identification of 700 drugs and metabolites in biological fluids using a hybrid triple-quadrupole linear ion trap mass spectrometer in a single analytical run. After standardization of the method, the retention times of 700 compounds were determined and transitions for each compound were selected by a "scheduled" survey MRM scan, followed by an information-dependent acquisition using the sensitive enhanced product ion scan of a Q TRAP hybrid instrument. The identification of the compounds in the samples analyzed was accomplished by searching the tandem mass spectrometry (MS/MS) spectra against the library we developed, which contains electrospray ionization-MS/MS spectra of over 1,250 compounds. The multi-target screening method together with the library was included in a software program for routine screening and quantitation to achieve automated acquisition and library searching. With the help of this software application, the time for evaluation and interpretation of the results could be drastically reduced. This new multi-target screening method has been successfully applied for the analysis of postmortem and traffic offense samples as well as proficiency testing, and complements screening with immunoassays, gas chromatography-mass spectrometry, and liquid chromatography-diode-array detection. Other possible applications are analysis in clinical toxicology (for intoxication cases), in psychiatry (antidepressants and other psychoactive drugs), and in forensic toxicology (drugs and driving, workplace drug testing, oral fluid analysis, drug-facilitated sexual assault).
Abstract:
A new simple method for two-dimensional determination of the optical density of macular pigment xanthophyll (ODx) in clinical routine is based on a single blue-reflection fundus image. Individually differing vignetting is corrected by a shading function. For its construction, nodes are automatically found in structureless image regions. The influence of stray light in elderly crystalline lenses is compensated by an age-dependent correction function. The reproducibility of parameters in the one-wavelength reflection method, determined for three subjects (47, 61, and 78 years old), was maxODx = 6.3%, meanODx = 4.6%, volume = 6%, and area = 6% even before stray-light correction. After stray-light correction, ODx was comparable in pseudophakic eyes and in eyes with a crystalline lens in the same 11 subjects. Significant correlation in ODx was found between the one-wavelength reflection method and the two-wavelength autofluorescence method for pseudophakic and cataract eyes of 19 patients suffering from dry age-related macular degeneration (AMD) (R(2) = 0.855). In pseudophakic eyes, maxODx was significantly lower in dry AMD (n = 45) (ODx = 0.491±0.102 ODU) than in eyes with healthy fundus (n = 22) (ODx = 0.615±0.103 ODU) (p = 0.000033). Also in eyes with a crystalline lens, maxODx was lower in AMD (n = 125) (ODx = 0.610±0.093 ODU) than in healthy subjects (n = 45) (ODx = 0.674±0.098 ODU) (p = 0.00019). No dependence on age was found in the pseudophakic eyes of either healthy subjects or AMD patients.
Abstract:
The G3, CBS-QB3, and CBS-APNO methods have been used to calculate ΔH and ΔG values for deprotonation of seventeen gas-phase reactions where the experimental values are reported to be accurate within one kcal/mol. For these reactions, the mean absolute deviation of these three methods from experiment is 0.84 to 1.26 kcal/mol, and the root-mean-square deviation for ΔG and ΔH is 1.43 and 1.49 kcal/mol for the CBS-QB3 method, 1.06 and 1.14 kcal/mol for the CBS-APNO method, and 1.16 and 1.28 kcal/mol for the G3 method. The high accuracy of these methods makes them reliable for calculating gas-phase deprotonation reactions, and allows them to serve as a valuable check on the accuracy of experimental data reported in the National Institute of Standards and Technology database.
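The deviation statistics quoted above (mean absolute deviation and root-mean-square deviation from experiment) are straightforward to compute; the numeric values in the example below are hypothetical, for illustration only:

```python
def mad_rmsd(calculated, experimental):
    """Mean absolute deviation and root-mean-square deviation between
    calculated and experimental values (e.g. deprotonation free energies
    in kcal/mol)."""
    diffs = [c - e for c, e in zip(calculated, experimental)]
    n = len(diffs)
    mad = sum(abs(d) for d in diffs) / n
    rmsd = (sum(d * d for d in diffs) / n) ** 0.5
    return mad, rmsd

# Hypothetical calculated vs. experimental values for three reactions.
mad, rmsd = mad_rmsd([350.2, 340.8, 355.5], [351.0, 340.0, 354.5])
```

By construction RMSD is never smaller than MAD, since squaring weights the larger deviations more heavily.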
Abstract:
We investigated the feasibility of postmortem percutaneous needle biopsy (PNB) for obtaining pulmonary samples adequate for the study of pulmonary fat embolism (PFE). Samples of both lungs were obtained from 26 cadavers via two different methods: (i) PNB and (ii) the double-edged knife technique, the gold standard at our institute. After water storage and Sudan III staining, six forensic pathologists independently examined all samples for the presence and severity of PFE. The results were compared and analyzed in each case regarding the vitality of the PFE and its relationship to the cause of death. The results showed that PFE was almost identically diagnosed and graded on the samples obtained via both methods. The discrepancies between the two techniques did not affect the diagnoses of vitality or cause of death related to PFE. This study demonstrates the feasibility of the PNB sampling method for the diagnosis and interpretation of PFE in the postmortem setting.
Abstract:
BACKGROUND: Physiological data obtained with the pulmonary artery catheter (PAC) are susceptible to errors in measurement and interpretation. Little attention has been paid to the relevance of errors in hemodynamic measurements performed in the intensive care unit (ICU). The aim of this study was to assess the errors related to the technical aspects (zeroing and reference level) and actual measurement (curve interpretation) of the pulmonary artery occlusion pressure (PAOP). METHODS: Forty-seven participants in a special ICU training program and 22 ICU nurses were tested without pre-announcement. All participants had previously been exposed to the clinical use of the method. The first task was to set up a pressure measurement system for PAC (zeroing and reference level) and the second to measure the PAOP. RESULTS: The median difference from the reference mid-axillary zero level was -3 cm (-8 to +9 cm) for physicians and -1 cm (-5 to +1 cm) for nurses. The median difference from the reference PAOP was 0 mmHg (-3 to 5 mmHg) for physicians and 1 mmHg (-1 to 15 mmHg) for nurses. When PAOP values were adjusted for the differences from the reference transducer level, the median differences from the reference PAOP values were 2 mmHg (-6 to 9 mmHg) for physicians and 2 mmHg (-6 to 16 mmHg) for nurses. CONCLUSIONS: Measurement of the PAOP is susceptible to substantial error as a result of practical mistakes. Comparison of results between ICUs or practitioners is therefore not possible.