910 results for Direct Analysis Method


Relevance: 90.00%

Abstract:

This study aimed to apprehend and analyze the perspective of Primary Health Care managers concerning nurses' work in Children's Health Surveillance in a city in São Paulo state. The study population consisted of eight professionals from different categories directly involved in the municipal management of the population's Health Surveillance. It is a descriptive, qualitative study. Data were collected through recorded semi-structured interviews and analyzed using the thematic Content Analysis Method. The results were systematized into three themes: 1) managers' conceptualizations concerning Children's Health Surveillance and its application in practice; 2) managers' perspectives concerning nurses' work in Children's Health Surveillance; and 3) qualification of Children's Health Surveillance from the viewpoint of municipal management. The conceptualizations of Children's Health Surveillance that were apprehended proved convergent, indicating this model's appropriateness for identifying and prioritizing care for children in conditions of vulnerability in the territory where they live. However, some managers did not include health promotion aspects in their statements as one of the cornerstones of their managerial action. Nurses were considered fundamental to the Children's Health Surveillance process because of the competencies and responsibilities they undertake at this level of health provision. The main difficulties in adequately implementing Children's Health Surveillance in Primary Health Care, and proposals to overcome them, were pointed out. It was concluded that, from the managers' perspective, nurses can greatly contribute to Children's Health Surveillance in Primary Health Care as members of the health care team; to that end, however, they need professional qualification, structural conditions and institutional support in that regard.

Relevance: 90.00%

Abstract:

The importance of mechanical aspects of cell activity and its environment is becoming more evident due to their influence on stem cell differentiation and on the development of diseases such as atherosclerosis. Mechanical tension homeostasis is related to normal tissue behavior, and its loss may be related to the formation of cancer, which shows a higher mechanical tension. Due to the complexity of cellular activity, the application of simplified models may elucidate which factors are really essential and which have only a marginal effect. A systematic method to reconstruct the elements involved in the cell's perception of mechanical aspects could substantially accelerate the validation of these models. This work proposes the development of a routine capable of reconstructing the topology of focal adhesions and of the actomyosin portion of the cytoskeleton from the displacement field generated by the cell on a flexible substrate. Another way to state this problem is as an algorithm that reconstructs the forces applied by the cell from measurements of the substrate displacement, which characterizes it as an inverse problem. For this kind of problem, the Topology Optimization Method (TOM) is suitable for finding a solution. TOM consists of the iterative application of an optimization method and an analysis method to obtain an optimal distribution of material in a fixed domain. One way to obtain the substrate displacement experimentally is through Traction Force Microscopy (TFM), which also provides the forces applied by the cell. Along with systematically generating the distributions of focal adhesions and actomyosin for the validation of simplified models, the algorithm also represents a complementary, more phenomenological approach to TFM. As a first approximation, the actin fibers and the flexible substrate are represented by a two-dimensional linear Finite Element Method model. Actin contraction is modeled as an initial stress in the FEM elements, and focal adhesions connecting actin and substrate are represented by springs. The algorithm was applied to data from experiments on cytoskeletal prestress and micropatterning, and the numerical results were compared with the experimental ones.
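The inverse problem described above can be illustrated with a minimal sketch, assuming a known linear substrate response: displacements u relate to forces f through a response matrix G (a stand-in for the FEM substrate model), and a Tikhonov-regularized least-squares inversion recovers f from u. All names and parameters here are illustrative, not the author's implementation.

```python
import numpy as np

def reconstruct_forces(G, u, alpha=1e-3):
    # Solve min ||G f - u||^2 + alpha ||f||^2 in closed form:
    # f = (G^T G + alpha I)^-1 G^T u. The regularization term alpha
    # stabilizes the inversion of noisy displacement data.
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ u)

# Synthetic check: forward-simulate displacements, then invert.
rng = np.random.default_rng(0)
G = rng.standard_normal((50, 10))     # toy response matrix
f_true = rng.standard_normal(10)      # "cell" forces
u = G @ f_true                        # measured displacements
f_rec = reconstruct_forces(G, u, alpha=1e-6)
```

With small regularization and noise-free data the reconstruction is essentially exact; in practice alpha trades off noise suppression against bias.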

Relevance: 90.00%

Abstract:

In this paper, we present a novel texture analysis method based on deterministic partially self-avoiding walks and fractal dimension theory. After the attractors of the image (sets of pixels) are found using deterministic partially self-avoiding walks, they are dilated toward the whole image by adding pixels according to their relevance. The relevance of each pixel is calculated as the shortest path between that pixel and the pixels that belong to the attractors. The proposed texture analysis method is demonstrated to outperform popular and state-of-the-art methods (e.g. Fourier descriptors, co-occurrence matrices, Gabor filters and local binary patterns), as well as the deterministic tourist walk method and recent fractal methods, on well-known texture image datasets.
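The pixel-relevance step can be sketched as a multi-source shortest-path computation: given a binary mask of attractor pixels, each remaining pixel's relevance is its grid distance to the attractor set. This is my own minimal illustration (4-connected BFS), not the paper's exact path definition.

```python
from collections import deque
import numpy as np

def relevance_map(attractor_mask):
    # Multi-source BFS: distance of every pixel to the nearest
    # attractor pixel along 4-connected grid moves.
    h, w = attractor_mask.shape
    dist = np.full((h, w), -1, dtype=int)
    q = deque()
    for y, x in zip(*np.nonzero(attractor_mask)):
        dist[y, x] = 0
        q.append((y, x))
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny, nx] < 0:
                dist[ny, nx] = dist[y, x] + 1
                q.append((ny, nx))
    return dist

mask = np.zeros((5, 5), dtype=bool)
mask[2, 2] = True            # single attractor pixel in the centre
dmap = relevance_map(mask)
```

Pixels would then be added to the dilated attractor in order of increasing distance in `dmap`.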

Relevance: 90.00%

Abstract:

The goal of the NA48 experiment at CERN is to measure the parameter Re(epsilon'/epsilon) of direct CP violation with a precision of 2×10⁻⁴. Experimentally accessible is the double ratio R, formed from the decays of the K_L and K_S into two neutral and two charged pions, respectively. To good approximation, R = 1 − 6 Re(epsilon'/epsilon).
NA48 weights the K_L events to reduce the sensitivity to the detector acceptance. As a cross-check of the standard analysis, an analysis without event weighting was carried out; its result is presented in this thesis. By forgoing event weighting, the statistical part of the total error can be reduced considerably. Since the limiting channel is the decay of the long-lived kaon into two neutral pions, using the full number of K_L decays is a worthwhile goal. In the course of this work, however, it turned out that the systematic error of the acceptance correction cancels this gain.

The result of this work for the 1998 and 1999 data without event weighting is
Re(epsilon'/epsilon) = (17.91 ± 4.41 (syst.) ± 1.36 (stat.)) × 10⁻⁴.
This clearly confirms the existence of direct CP violation. The result is compatible with the published NA48 result, so the test of the standard NA48 analysis strategy was carried out successfully.
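The linearized relation between the double ratio R and the direct CP violation parameter can be inverted trivially; a one-line sketch:

```python
# To good approximation, R = 1 - 6 * Re(eps'/eps), so the measured
# double ratio R maps back to the CP violation parameter as follows.
def re_eps_prime_over_eps(R):
    return (1.0 - R) / 6.0

# Round-trip check with an illustrative value of Re(eps'/eps):
val = re_eps_prime_over_eps(1 - 6 * 17.91e-4)
```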

Relevance: 90.00%

Abstract:

In July 2009, an experiment was carried out for the first time at the Mainz Microtron (MAMI) in which a polarized ³He target was probed with photons in the energy range from 200 to 800 MeV. The goal of this experiment was to test the Gerasimov-Drell-Hearn sum rule on the neutron. Owing to the spin structure of ³He, the data obtained with the polarized ³He target provide, in comparison with the existing deuteron data, complementary and more direct access to the neutron. The total helicity-dependent photoabsorption cross section was measured with an energy-tagged beam of circularly polarized photons impinging on the longitudinally polarized ³He target. The product detectors were the Crystal Ball (4π solid-angle coverage), TAPS (as a forward wall) and a threshold Cherenkov detector (online veto to suppress electromagnetic events). Planning and construction of the various components of the ³He experimental setup were a central part of this dissertation and are described in detail in this thesis. The detector system and the analysis methods were tested by measuring the unpolarized, total and inclusive photoabsorption cross section on liquid hydrogen; the results agreed well with previously published data. Preliminary results for the unpolarized total photoabsorption cross section and for the helicity-dependent difference of the photoabsorption cross sections on ³He are presented in comparison with different theoretical models.
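For reference, the Gerasimov-Drell-Hearn sum rule that the experiment tests relates the helicity-dependent photoabsorption cross sections to the target's anomalous magnetic moment; a standard textbook form (not quoted from the thesis itself) is:

```latex
% sigma_{3/2}, sigma_{1/2}: total photoabsorption cross sections for
% photon and target spins parallel / antiparallel; nu: photon energy;
% kappa: anomalous magnetic moment; M: target mass; alpha: fine-structure constant.
\int_{\nu_{\mathrm{thr}}}^{\infty}
  \frac{\sigma_{3/2}(\nu) - \sigma_{1/2}(\nu)}{\nu}\, d\nu
  = \frac{2\pi^2 \alpha}{M^2}\,\kappa^2
```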

Relevance: 90.00%

Abstract:

The current prominence of the close relationship between climate change and anthropogenic influence has long drawn attention to the greenhouse effect and global warming, as well as to the rise in atmospheric concentrations of climatically active gases, first among them CO2. Radiocarbon is currently the environmental tracer par excellence, able to provide, through a "top-down" approach, a valid control tool to discriminate and quantify the fossil or biogenic carbon dioxide present in the atmosphere. Thus, alongside the traditional application fields of ¹⁴C, such as archaeometric dating, new areas have emerged: on one side the energy sector, with the issues associated with plant emissions, fuels and geological CO2 storage; on the other the fast-growing market of so-called biobased products made from renewable raw materials. In this thesis work the world of radiocarbon was therefore explored both from a strictly technical and methodological point of view and from the application point of view across its many and diverse fields of investigation. An analysis system based on the radiometric method of direct CO2 absorption followed by liquid scintillation counting (LSC) was built and validated, with technological improvements and procedural refinements aimed at improving the performance of the method in terms of simplicity, sensitivity and reproducibility. While generally a good compromise with respect to the methodologies traditionally used for ¹⁴C analysis, the method is at present still inadequate for application fields requiring very high precision, but it is competitive for the analysis of modern samples with a high ¹⁴C concentration.
The experiments carried out on some ionic liquids, although preliminary and not conclusive, finally open new lines of research on the possibility of using this new class of compounds as media for CO2 capture and ¹⁴C analysis in LSC.
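The "top-down" discrimination rests on a simple mass balance: fossil CO2 contains no ¹⁴C, so a sample's biogenic fraction scales with its ¹⁴C content relative to a contemporary biogenic reference. A minimal sketch, with illustrative numbers and a simplified reference value of 100 pMC:

```python
# Fossil-derived CO2 is 14C-free, so the biogenic fraction of a sample
# is its 14C activity (in percent modern carbon, pMC) divided by the
# activity of a fully biogenic reference. The 100 pMC reference is a
# simplification; real analyses apply a year-specific correction.
def biogenic_fraction(pmc_sample, pmc_modern_ref=100.0):
    return pmc_sample / pmc_modern_ref

f_bio = biogenic_fraction(65.0)   # sample measured at 65 pMC
f_fossil = 1.0 - f_bio            # remainder attributed to fossil CO2
```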

Relevance: 90.00%

Abstract:

This work presents the most precise and the first direct high-precision measurement of the g-factor of a single proton. The measurement is based on the non-destructive determination of the cyclotron frequency and the Larmor frequency of a proton stored in a Penning trap. To determine the Larmor frequency, the spin-flip probability is recorded as a function of an external spin-flip drive. For this purpose the continuous Stern-Gerlach effect is used, which couples the spin moment to the axial motion of the proton, so that a spin flip shows up as a jump of the axial oscillation frequency. The difficulty lies in detecting this frequency jump against a background of axial frequency fluctuations. To meet this challenge, novel methods and techniques were applied. On the one hand, superconducting detection circuits of highest sensitivity were developed, allowing fast and thus precise frequency measurements. On the other hand, a spin-flip analysis method based on Bayes' theorem was applied. With these improvements it was possible to observe individual spin flips of a single proton. This in turn enabled the application of the so-called double-trap method, and thus the measurement of the g-factor mentioned above with a precision of 4.3 × 10⁻⁹.
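The relation underlying the measurement can be sketched in one line: in the same magnetic field, the g-factor follows from the ratio of the Larmor frequency to the free cyclotron frequency. The numbers below are illustrative only, not the published values.

```python
# g = 2 * nu_L / nu_c, with nu_L the Larmor (spin precession) frequency
# and nu_c the free cyclotron frequency of the proton in the same field.
def g_factor(nu_L, nu_c):
    return 2.0 * nu_L / nu_c

# Illustrative frequencies chosen so that nu_L/nu_c ~ 2.79285,
# close to the proton's known frequency ratio:
g = g_factor(nu_L=2.79285e6, nu_c=1.0e6)
```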

Relevance: 90.00%

Abstract:

The purpose of this clinical trial was to determine the active tactile sensibility of natural teeth and to obtain a statistical analysis method fitting a psychometric function through the observed data points. In 68 fully dentate subjects (34 males, 34 females, mean age 45.9 ± 16.1 years), one pair of healthy natural teeth each was tested: n = 24 anterior and n = 44 posterior teeth. In the computer-assisted, randomized measurement, subjects bit on thin copper foils of different thickness (5-200 µm) inserted between the teeth. The threshold of active tactile sensibility was defined as the 50% value of correct answers. Additionally, the gradient of the sensibility curve and the support area (90-10% value), which describe the shape of the sensibility curve, were calculated. Both symmetric and asymmetric functions were used to model the sensibility curve. The mean sensibility threshold was 14.2 ± 12.1 µm. The older the subject, the higher the tactile threshold (r = 0.42, p = 0.0006). The support area was 41.8 ± 43.3 µm. The higher the 50% threshold, the smaller the gradient of the curve and the larger the support area. The curves of the active tactile sensibility of natural teeth show a tendency towards asymmetry, so that active tactile sensibility is mathematically best described by the asymmetric Weibull function.
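A Weibull psychometric function of the kind favored by the study can be sketched as follows; the scale and shape parameters below are hypothetical, not the fitted values from the trial, and the guessing rate is omitted for simplicity.

```python
import numpy as np

def weibull_p(x, lam, k):
    # Probability of a correct answer at foil thickness x (µm),
    # with scale lam and shape k; asymmetric around its midpoint.
    return 1.0 - np.exp(-(x / lam) ** k)

def threshold_50(lam, k):
    # Closed-form solution of weibull_p(x) = 0.5:
    # x50 = lam * ln(2)^(1/k).
    return lam * np.log(2.0) ** (1.0 / k)

# Hypothetical parameters for illustration:
x50 = threshold_50(lam=17.0, k=1.3)
```

In a real fit, lam and k would be estimated from the observed correct/incorrect responses (e.g. by maximum likelihood), and x50 would then give the 50% tactile threshold.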

Relevance: 90.00%

Abstract:

Craniosynostosis is a premature fusion of the sutures of an infant skull that restricts skull and brain growth. During the last decades there has been a rapid increase in fundamentally diverse surgical treatment methods. At present, surgical outcome is assessed using global variables such as the cephalic index, head circumference, and intracranial volume. However, these variables fail to describe the local deformations and morphological changes that may play a role in the neurologic disorders observed in these patients. This report describes a rigid image-registration-based method to evaluate outcomes of craniosynostosis surgical treatment, quantify head growth locally, and measure intracranial volume change indirectly. The semiautomatic analysis method was applied to computed tomography data sets of a 5-month-old boy with sagittal craniosynostosis who underwent expansion of the posterior skull with cranioplasty. Local changes between the pre- and postoperative images were quantified by mapping the minimum distance of individual points from the preoperative to the postoperative surface meshes, and indirect intracranial volume changes were estimated. The proposed methodology can provide the surgeon with a tool for quantitative evaluation of surgical procedures and detection of abnormalities of the infant skull and its development.
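The distance-mapping step can be sketched with a point-to-point approximation: for each vertex of the (registered) preoperative surface, take the minimum Euclidean distance to the postoperative surface, here simplified to its vertex cloud. This brute-force version is my own illustration; a real pipeline would use point-to-triangle distances and a spatial index.

```python
import numpy as np

def min_distances(pre_pts, post_pts):
    # Pairwise distances between the two point sets (N_pre x N_post),
    # then the per-preoperative-point minimum.
    d = np.linalg.norm(pre_pts[:, None, :] - post_pts[None, :, :], axis=2)
    return d.min(axis=1)

# Toy example: two preoperative points mapped onto two postoperative ones.
pre = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
post = np.array([[0.0, 0.0, 2.0], [1.0, 0.0, 3.0]])
dist = min_distances(pre, post)
```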

Relevance: 90.00%

Abstract:

BACKGROUND: Assessment of lung volume (FRC) and ventilation inhomogeneities with ultrasonic flowmeter and multiple breath washout (MBW) has been used to provide important information about lung disease in infants. Sub-optimal adjustment of the mainstream molar mass (MM) signal for temperature and external deadspace may lead to analysis errors in infants with critically small tidal volume changes during breathing. METHODS: We measured expiratory temperature in human infants at 5 weeks of age and examined the influence of temperature and deadspace changes on FRC results with computer simulation modeling. A new analysis method with optimized temperature and deadspace settings was then derived, tested for robustness to analysis errors and compared with the previously used analysis methods. RESULTS: Temperature in the facemask was higher and variations of deadspace volumes larger than previously assumed. Both showed considerable impact upon FRC and LCI results with high variability when obtained with the previously used analysis model. Using the measured temperature we optimized model parameters and tested a newly derived analysis method, which was found to be more robust to variations in deadspace. Comparison between both analysis methods showed systematic differences and a wide scatter. CONCLUSION: Corrected deadspace and more realistic temperature assumptions improved the stability of the analysis of MM measurements obtained by ultrasonic flowmeter in infants. This new analysis method using the only currently available commercial ultrasonic flowmeter in infants may help to improve stability of the analysis and further facilitate assessment of lung volume and ventilation inhomogeneities in infants.
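The two washout outcomes mentioned above are linked by a simple ratio: the lung clearance index (LCI) is the cumulative expired volume (CEV) needed to wash the tracer down to 1/40 of its starting concentration, divided by the FRC. Function and variable names below are my own shorthand, not the paper's.

```python
# LCI = CEV / FRC, i.e. the number of lung-volume turnovers required
# to clear the tracer gas; higher values indicate more ventilation
# inhomogeneity.
def lung_clearance_index(cev_ml, frc_ml):
    return cev_ml / frc_ml

lci = lung_clearance_index(cev_ml=1400.0, frc_ml=200.0)  # -> 7.0 turnovers
```

This also makes the paper's point concrete: any error in the FRC estimate (from deadspace or temperature assumptions) propagates directly into the LCI.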

Relevance: 90.00%

Abstract:

Light-frame wood buildings are widely built in the United States (U.S.), and natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainty are considered in the collapse risk assessment so that their influence on the collapse risk of light-frame wood construction can be evaluated. The collapse risks of the same building subjected to maximum considered earthquakes in different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant, causes huge economic losses, and threatens life safety, yet few studies have investigated the snow hazard in combination with a seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. Snow accumulation has a significant influence on the seismic losses of the building, and the Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction.
For homeowners and stakeholders, risk expressed as economic loss is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. One is to assess the loss of a building subjected to mainshock-aftershock sequences; aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess its loss under combined earthquake and snow loads. The proposed framework proves to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
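A filtered Poisson process of the kind described can be sketched generically: snow events arrive as a Poisson process, each deposits a random load, and every deposit then decays over time (melting/settling). All parameters below are illustrative, not the study's calibrated values.

```python
import numpy as np

def simulate_fpp(rate_per_day, mean_load, decay_per_day, days, rng):
    # Filtered Poisson process: N ~ Poisson(rate * T) events at uniform
    # times, each with an exponentially distributed load that decays
    # exponentially after arrival. Returns the daily total load.
    t = np.arange(days, dtype=float)
    load = np.zeros(days)
    n_events = rng.poisson(rate_per_day * days)
    event_times = rng.uniform(0.0, days, size=n_events)
    event_loads = rng.exponential(mean_load, size=n_events)
    for t_i, s_i in zip(event_times, event_loads):
        active = t >= t_i
        load[active] += s_i * np.exp(-decay_per_day * (t[active] - t_i))
    return load

rng = np.random.default_rng(1)
load = simulate_fpp(rate_per_day=0.2, mean_load=0.5,
                    decay_per_day=0.05, days=120, rng=rng)
```

Unlike a Bernoulli (on/off) snow model, the simulated load varies continuously, so its coincidence with an earthquake can take any value between zero and the seasonal peak.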

Relevance: 90.00%

Abstract:

Objective: Identification of the ventrointermediate thalamic nucleus (Vim) in modern 3T high-field MRI for image-based targeting in deep brain stimulation (DBS) is still challenging. To evaluate the usefulness and reliability of analyzing its connectivity with the cerebellum using Q-ball calculation, we performed a retrospective analysis. Method: 5 patients who underwent bilateral implantation of electrodes in the Vim for treatment of essential tremor between 2011 and 2012 received additional preoperative Q-ball imaging. Targeting was performed according to atlas coordinates and standard MRI. Additionally, we performed a retrospective identification of the Vim by analyzing the connectivity of the thalamus with the dentate nucleus. The exact position of the active stimulation contact in the postoperative CT was correlated with the Vim as identified by Q-ball calculation. Results: Localization of the Vim by analysis of the connectivity between thalamus and cerebellum was successful in all 5 patients on both sides. The average position of the active contacts was 14.6 mm (SD 1.24) lateral, 5.37 mm (SD 0.094) posterior and 2.21 mm (SD 0.69) cranial of MC. The cranial portion of the dentato-rubro-thalamic tract was localized an average of 3.38 mm (SD 1.57) lateral and 1.5 mm (SD 1.22) posterior of the active contact. Conclusions: Connectivity analysis by Q-ball calculation provided direct visualization of the Vim in all cases. Our preliminary results suggest that the target determined by connectivity analysis is valid and could be used in addition to, or even instead of, atlas-based targeting. Larger prospective calculations are needed to determine the robustness of this method in providing refined information useful for neurosurgical treatment of tremor.

Relevance: 90.00%

Abstract:

The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached based on detailed measurements of hadron emission from the same target as used by T2K exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper details of the experiment, data taking, data analysis method and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.

Relevance: 90.00%

Abstract:

The direct Bayesian admissible-region approach is an a priori state-free measurement association and initial orbit determination technique for optical tracks. In this paper, we test a hybrid approach that appends a least-squares estimator to the direct Bayesian method on measurements taken at the Zimmerwald Observatory of the Astronomical Institute of the University of Bern. Over half of the association pairs agreed with conventional geometric track correlation and least-squares techniques. The remaining pairs cast light on the fundamental limits of conducting tracklet association based solely on dynamical and geometrical information.

Relevance: 90.00%

Abstract:

OBJECTIVES The aim of this study was to optimise dexmedetomidine and alfaxalone dosing, for intramuscular administration with butorphanol, to perform minor surgeries in cats. METHODS Initially, cats were assigned to one of five groups, each composed of six animals and receiving, in addition to 0.3 mg/kg butorphanol intramuscularly, one of the following: (A) 0.005 mg/kg dexmedetomidine, 2 mg/kg alfaxalone; (B) 0.008 mg/kg dexmedetomidine, 1.5 mg/kg alfaxalone; (C) 0.012 mg/kg dexmedetomidine, 1 mg/kg alfaxalone; (D) 0.005 mg/kg dexmedetomidine, 1 mg/kg alfaxalone; and (E) 0.012 mg/kg dexmedetomidine, 2 mg/kg alfaxalone. Thereafter, a modified 'direct search' method, conducted in a stepwise manner, was used to optimise drug dosing. The quality of anaesthesia was evaluated on the basis of composite scores (one for anaesthesia and one for recovery), visual analogue scales and the propofol requirement to suppress spontaneous movements. The medians or means of these variables were used to rank the treatments; 'unsatisfactory' and 'promising' combinations were identified to calculate, through the equation first described by Berenbaum in 1990, new dexmedetomidine and alfaxalone doses to be tested in the next step. At each step, five combinations (one new plus the best previous four) were tested. RESULTS None of the tested combinations resulted in adverse effects. Four steps and 120 animals were necessary to identify the optimal drug combination (0.014 mg/kg dexmedetomidine, 2.5 mg/kg alfaxalone and 0.3 mg/kg butorphanol). CONCLUSIONS AND RELEVANCE The investigated drug mixture, at the doses found with the optimisation method, is suitable for cats undergoing minor clinical procedures.
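The stepwise search can be sketched generically: keep the best four dose combinations, then generate one new candidate from the 'promising' and 'unsatisfactory' combinations. The reflection rule below is a stand-in of my own, not Berenbaum's 1990 equation, and the dose values are taken from the abstract purely for illustration.

```python
import numpy as np

def next_candidate(promising, unsatisfactory):
    # Move from an unsatisfactory combination through the centroid of
    # the promising ones; clip to keep doses non-negative. This is a
    # generic direct-search step, not the study's exact formula.
    centroid = np.mean(promising, axis=0)
    return np.clip(2.0 * centroid - unsatisfactory, 0.0, None)

# Doses as (dexmedetomidine, alfaxalone) in mg/kg, from the abstract:
promising = np.array([[0.012, 2.0], [0.008, 1.5]])   # groups E and B
unsatisfactory = np.array([0.005, 1.0])              # group D
new_dose = next_candidate(promising, unsatisfactory)
```

Each proposed combination would then be scored clinically (anaesthesia and recovery composite scores, propofol requirement) and ranked against the previous best four before the next step.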