954 results for Local linearization methods
Abstract:
PURPOSE: To evaluate the rate of tumor recurrence within the irradiated volume after initial low-dose irradiation of limited-stage small-cell lung cancer (SCLC), to assess the tolerance of a sequential combination of low-dose chest irradiation followed by chemotherapy, and to confirm the responsiveness of limited-stage SCLC to low-dose irradiation. METHODS AND MATERIALS: In this pilot study, 26 patients with limited-stage SCLC were treated by first-line 20-Gy thoracic irradiation followed 3 weeks later by chemotherapy (cisplatin, doxorubicin, and etoposide for six cycles). RESULTS: We present our final results, with a median follow-up of 7 years among surviving patients. The response rate to this low-dose irradiation was 83%, with an overall response rate to radiochemotherapy of 96% and a median survival of 21 months. No unexpected early or late toxicity was observed. The rate of initial isolated local failure was 8%, which compares favorably with other published series using higher doses of radiochemotherapy. CONCLUSION: An initial chest irradiation of 20 Gy before chemotherapy could be sufficient to reduce the risk of local failure during the survival time of patients with limited-stage SCLC. Potential advantages of this treatment may be the prevention of radiotherapy resistance mechanisms induced by preliminary chemotherapy and reduced radiation-induced toxicity.
Abstract:
The relationship between electrophysiological and functional magnetic resonance imaging (fMRI) signals remains poorly understood. To date, studies have required invasive methods and have been limited to single functional regions, and thus cannot account for possible variations across brain regions. Here we present a method that uses fMRI data and single-trial electroencephalography (EEG) analyses to assess the spatial and spectral dependencies between the blood-oxygenation-level-dependent (BOLD) responses and the noninvasively estimated local field potentials (eLFPs) over a wide range of frequencies (0-256 Hz) throughout the entire brain volume. This method was applied in a study where human subjects completed separate fMRI and EEG sessions while performing a passive visual task. Intracranial LFPs were estimated from the scalp-recorded data using the ELECTRA source model. We compared statistical images from BOLD signals with statistical images of each frequency of the eLFPs. In agreement with previous studies in animals, we found a significant correspondence between LFP and BOLD statistical images in the gamma band (44-78 Hz) within primary visual cortices. In addition, significant correspondence was observed at low frequencies (<14 Hz) and also at very high frequencies (>100 Hz). Effects within extrastriate visual areas showed a different correspondence that not only included those frequency ranges observed in primary cortices but also additional frequencies. Results therefore suggest that the relationship between electrophysiological and hemodynamic signals might vary both as a function of frequency and of anatomical region.
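As an illustrative sketch of the map-comparison step described above (not the study's actual pipeline), the snippet below correlates a BOLD statistical image with per-frequency eLFP statistical images; all arrays are synthetic stand-ins.

```python
import numpy as np

# Synthetic stand-ins for the statistical images (not study data).
rng = np.random.default_rng(0)
n_voxels = 5000
bold_map = rng.standard_normal(n_voxels)                 # BOLD statistical image
freqs = np.arange(0, 257, 2)                             # 0-256 Hz, as in the study
elfp_maps = rng.standard_normal((freqs.size, n_voxels))  # one eLFP map per frequency

# Spatial correspondence: correlate each frequency's eLFP map with the BOLD map.
corr = np.array([np.corrcoef(bold_map, m)[0, 1] for m in elfp_maps])

gamma = (freqs >= 44) & (freqs <= 78)                    # gamma band from the abstract
print(f"mean gamma-band correspondence: {corr[gamma].mean():.3f}")
```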
Abstract:
PURPOSE OF THE STUDY: This prospective study reports our preliminary results with local anaesthesia (LA) for carotid endarterectomy (CEA). MATERIAL AND METHODS: Twenty CEAs in nineteen patients were performed using a three-stage local infiltration technique. CEAs were performed through a short Duplex-assisted skin incision (median length: 55 mm) using a retro-jugular approach and polyurethane patch closure (median length: 35 mm). RESULTS: There were 13 men and 6 women with a mean age of 71.2 years. The indications for CEA were asymptomatic lesions in 11 cases, stroke in 7 cases and transient ischaemic attack in 2 cases. The median degree of internal carotid artery stenosis was 90%. One patient (5%) required an intraluminal shunt. There were no peri-operative deaths, strokes or conversions to general anaesthesia (GA). The median length of stay was 3 days. CONCLUSIONS: LA is a good alternative to GA. It can be adopted after a feasibility study and a short training period. In our centre, it is a safe and effective procedure associated with low morbidity, high patient acceptance and a short hospital stay.
3D coronary vessel wall imaging utilizing a local inversion technique with spiral image acquisition.
Abstract:
Current 2D black-blood coronary vessel wall imaging suffers from relatively limited coverage of the coronary artery tree. Hence, a 3D approach facilitating more extensive coverage would be desirable. The straightforward combination of a 3D acquisition technique with a dual inversion prepulse can decrease the effectiveness of the black-blood preparation. To minimize artifacts from insufficiently suppressed blood signal of nearby blood pools, and to reduce residual respiratory motion artifacts from the chest wall, a novel local inversion technique was implemented. The combination of a nonselective inversion prepulse with a 2D selective local inversion prepulse allowed suppression of unwanted signal outside a user-defined region of interest. Among 10 subjects evaluated using a 3D spiral readout, the local inversion pulse effectively suppressed signal from ventricular blood, myocardium, and chest wall tissue in all cases. The coronary vessel wall could be visualized within the entire imaging volume.
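For context, the inversion delay in a dual-inversion black-blood preparation is commonly chosen to null the inverted blood signal. Assuming complete inversion and full recovery between shots (a textbook simplification, not a parameter reported above), the standard relation is:

```latex
M_z(TI) = M_0\left(1 - 2e^{-TI/T_1}\right) = 0
\quad\Longrightarrow\quad
TI = T_1 \ln 2
```

where $T_1$ is the longitudinal relaxation time of blood and $TI$ the inversion delay.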
Abstract:
Blowing and drifting of snow is a major concern for transportation efficiency and road safety in regions where these phenomena are common. One common way to mitigate snow drift on roadways is to install plastic snow fences. Correct design of snow fences is critical for road safety and for keeping roads open during winter in the US Midwest and other regions affected by large snow events, as well as for minimizing the costs related to snow accumulation on roads and road repair. Of critical importance for road safety is the protection against snow drifting in regions with narrow rights of way, where standard fences cannot be deployed at the recommended distance from the road. Designing snow fences requires sound engineering judgment and a thorough evaluation of the potential for snow blowing and drifting at the construction site. The evaluation includes site-specific design parameters typically obtained with semi-empirical relations characterizing the local transport conditions. Among the critical parameters involved in fence design and in the assessment of post-construction efficiency is the quantification of snow accumulation at fence sites. The present study proposes a joint experimental and numerical approach to monitor snow deposits around snow fences, quantitatively estimate snow deposits in the field, assess the efficiency of snow fences, and improve their design. Snow deposit profiles were mapped using GPS-based real-time kinematic (RTK) surveys conducted at the monitored field site during and after snowstorms. The monitored site allowed testing different snow fence designs under close-to-identical conditions over four winter seasons. The study also discusses the detailed monitoring system and the analysis of weather forecasts and meteorological conditions at the monitored sites. A main goal of the present study was to assess the performance of lightweight plastic snow fences with a lower porosity than the typical 50% porosity used in standard designs of such fences. The field data collected during the first winter were used to identify the best design for snow fences with a porosity of 50%. Flow fields obtained from numerical simulations showed that the fence design that worked best during the first winter induced the formation of an elongated area of small velocity magnitude close to the ground. This information was used to identify other candidates for the optimum design of fences with a lower porosity. Two of the designs with a fence porosity of 30% that were found to perform well based on the numerical simulation results were tested in the field during the second winter, along with the best-performing design for fences with a porosity of 50%. Field data showed that the length of the snow deposit away from the fence was reduced by about 30% for the two proposed lower-porosity (30%) fence designs compared to the best design identified for fences with a porosity of 50%. Moreover, one of the lower-porosity designs tested in the field showed no significant snow deposition within the bottom gap region beneath the fence. Thus, a major outcome of this study is the recommendation to use plastic snow fences with a porosity of 30%. It is expected that this lower-porosity design will continue to work well for even more severe snow events or for successive snow events occurring during the same winter. The approach advocated in the present study allowed making general recommendations for optimizing the design of lower-porosity plastic snow fences.
This approach can be extended to improve the design of other types of snow fences. Some preliminary work on living snow fences is also discussed. Another major contribution of this study is to propose, develop protocols for, and test a novel technique based on close-range photogrammetry (CRP) to quantify the snow deposits trapped by snow fences. As image data can be acquired continuously, the time evolution of the volume of snow retained by a snow fence during a storm, or over a whole winter season, can in principle be obtained. Moreover, CRP is a non-intrusive method that eliminates the need to perform manual measurements during storms, which are difficult and sometimes dangerous. Presently, there is considerable empiricism in the design of snow fences, due to a lack of data on fence storage capacity, on how snow deposits change with fence design and snowstorm characteristics, and on the main parameters used by the state DOTs to design snow fences at a given site. The availability of such information from CRP measurements should provide critical data for evaluating the performance of a given snow fence design tested by the IDOT. As part of the present study, the novel CRP method was tested at several sites. The present study also discusses preliminary work to determine the snow relocation coefficient, one of the main variables that has to be estimated by IDOT engineers when using the standard snow fence design software (Snow Drift Profiler; Tabler, 2006). Our analysis showed that standard empirical formulas did not produce reasonable values when applied at the Iowa test sites monitored as part of the present study, and that simple methods to estimate this variable are not reliable. The present study makes recommendations for the development of a new methodology based on Large-Scale Particle Image Velocimetry that can directly measure snow drift fluxes and the amount of snow relocated by the fence.
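As a minimal sketch of how a surveyed deposit profile translates into a storage estimate (all numbers below are illustrative assumptions, not field data from this study):

```python
import numpy as np

# Hypothetical RTK cross-section: distance from the fence (m) vs snow depth (m).
distance = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
depth    = np.array([0.1, 0.6, 1.1, 0.9, 0.5, 0.2, 0.0])

area = np.trapz(depth, distance)   # deposit cross-sectional area (m^2)
fence_length = 100.0               # surveyed fence length (m, assumed)
volume = area * fence_length       # approximate trapped snow volume (m^3)
print(f"cross-section: {area:.1f} m^2, volume: ~{volume:.0f} m^3")
```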
Abstract:
It is well known that hospital malnutrition is a highly prevalent condition associated with increased morbidity and mortality as well as with higher related healthcare costs. Although previous studies have already measured the prevalence and/or costs of hospital malnutrition in our country, their local focus (at the regional or even hospital level) means that the true prevalence and economic impact of hospital malnutrition for the National Health System remain unknown in Spain. The PREDyCES® (Prevalence of hospital malnutrition and associated costs in Spain) study aimed to assess the prevalence of hospital malnutrition in Spain and to estimate the related costs. Several aspects made this study unique: a) it was the first study in a representative sample of Spanish hospitals; b) different measures to assess hospital malnutrition (NRS-2002 and MNA, as well as anthropometric and biochemical markers) were used both at admission and at discharge; and c) the economic consequences of malnutrition were estimated from the perspective of the Spanish National Health System.
Abstract:
Although many larger Iowa cities have staff traffic engineers who have a dedicated interest in safety, smaller jurisdictions do not. Rural agencies and small communities must rely on consultants, if available, or local staff to identify locations with a high number of crashes and to devise mitigating measures. However, smaller agencies in Iowa have other options for receiving assistance in obtaining and interpreting crash data. These options are addressed in this manual. Many proposed road improvements or alternatives can be evaluated using methods that do not require in-depth engineering analysis. The Iowa Department of Transportation (DOT) supported developing this manual to provide a tool that assists communities and rural agencies in identifying and analyzing local roadway-related traffic safety concerns. In the past, only a limited number of traffic safety professionals had access to adequate tools and training to evaluate potential safety problems quickly and efficiently and select possible solutions. Present-day programs and information are much more conducive to the widespread dissemination of crash data, mapping, data comparison, and the selection and comparison of alternatives. Information is now available in formats that do not require specialized training to understand and use. This manual describes several methods for reviewing crash data at a given location, identifying possible contributing causes, selecting countermeasures, and conducting economic analyses for the proposed mitigation. The Federal Highway Administration (FHWA) has also developed other analysis tools, which are described in the manual. This manual can also serve as a reference for traffic engineers and other analysts.
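As a hedged sketch of the kind of economic analysis such a manual describes (the crash modification factor, costs, and rates below are illustrative assumptions, not values from the manual):

```python
# All numbers are illustrative assumptions, not values from the manual.
crashes_per_year = 6.0        # observed crashes at the studied location
cmf = 0.75                    # crash modification factor of a countermeasure
cost_per_crash = 60_000       # average societal cost per crash (USD, assumed)

annual_benefit = crashes_per_year * (1 - cmf) * cost_per_crash
annualized_cost = 20_000      # annualized cost of the countermeasure (assumed)

bcr = annual_benefit / annualized_cost
print(f"benefit-cost ratio: {bcr:.2f}")  # BCR > 1 favors the countermeasure
```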
Abstract:
INTRODUCTION: Osteoset® T is a calcium sulphate void filler containing 4% tobramycin sulphate, used to treat bone and soft tissue infections. Despite systemic exposure to the antibiotic, no pharmacokinetic studies in humans have been published so far. Based on the observations made in our patients, a model predicting tobramycin serum levels and evaluating their toxicity potential is presented. METHODS: Following implantation of Osteoset® T, tobramycin serum concentrations were monitored systematically. A pharmacokinetic analysis was performed using a non-linear mixed-effects model based on a one-compartment model with first-order absorption. RESULTS: Data from 12 patients treated between October 2006 and March 2008 were analysed. Concentration profiles were consistent with first-order slow release and single-compartment kinetics, while showing substantial variability. Predicted tobramycin serum concentrations depended clearly on both the implanted drug amount and renal function. DISCUSSION AND CONCLUSION: Despite the popularity of aminoglycosides for local antibiotic therapy, pharmacokinetic data for this indication are scarce, and none are available for calcium sulphate as a carrier material. Systemic exposure to tobramycin after implantation of Osteoset® T appears reassuring regarding toxicity potential, except in cases of markedly impaired renal function. We recommend adapting the dosage to the estimated creatinine clearance rather than solely to the patient's weight.
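For reference, a one-compartment model with first-order absorption predicts the serum concentration by the standard Bateman equation (symbols are the usual pharmacokinetic notation, not fitted values from this study):

```latex
C(t) = \frac{F\,D\,k_a}{V\,(k_a - k_e)}\left(e^{-k_e t} - e^{-k_a t}\right)
```

where $D$ is the implanted dose, $F$ the bioavailable fraction, $V$ the volume of distribution, and $k_a$, $k_e$ the absorption and elimination rate constants; the dependence on renal function noted above enters through $k_e$.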
Abstract:
The proposed Federal Highway Administration (FHWA) amendments to the Manual on Uniform Traffic Control Devices (MUTCD) will change the way local agencies manage their pavement markings and place a focus on pavement marking quality and management methods. This research effort demonstrates how a pavement marking maintenance method could be developed and used at the local agency level. The report addresses the common problems agencies face in achieving good pavement marking quality and provides recommendations specific to these problems: assessing pavement marking needs, selecting pavement marking materials, contracting out pavement marking services, measuring and monitoring performance, and developing management tools to visualize pavement marking needs in a GIS format. The research includes five case studies, three counties and two cities, where retroreflectivity was measured over a spring and a fall season and then mapped to evaluate pavement marking performance and needs. The research also includes over 35 field demonstrations (installation and monitoring) of both longitudinal and transverse durable markings in a variety of local agency settings, all within a state with intensive snowplow activity.
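A minimal sketch of how two-season retroreflectivity readings could feed a maintenance decision (the readings and the 100 mcd/m²/lx threshold are assumptions for illustration, not findings of the report):

```python
import numpy as np

# Hypothetical spring and fall readings (mcd/m^2/lx) for one marking segment.
days = np.array([0.0, 180.0])
retro = np.array([280.0, 210.0])

slope, intercept = np.polyfit(days, retro, 1)   # simple linear degradation fit
threshold = 100.0                               # assumed minimum retroreflectivity
days_to_threshold = (threshold - intercept) / slope
print(f"predicted restriping in ~{days_to_threshold:.0f} days")
```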
Abstract:
Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to have much more control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods. Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
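A minimal sketch of the perturbation scheme described above, for a 1-D field: candidate values are redrawn from a distribution conditioned to the geophysical data (represented here by a user-supplied `cond_sampler`, an assumption), and the objective constrains the covariance only at small lags. This is an illustration of the technique, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sa_conditional_simulation(cond_sampler, target_cov, lags, n_cells,
                              n_iter=20000, t0=1.0, cooling=0.9995):
    """Sketch of SA conditional simulation on a 1-D grid.

    cond_sampler(i): draws a value for cell i from the distribution
        conditioned to the geophysical data (assumed supplied by the user).
    target_cov[j]: target covariance at lag lags[j] (small lags only).
    """
    field = np.array([cond_sampler(i) for i in range(n_cells)])

    def objective(f):
        fc = f - f.mean()
        return sum((np.mean(fc[:-k] * fc[k:]) - target_cov[j]) ** 2
                   for j, k in enumerate(lags))

    obj, temp = objective(field), t0
    for _ in range(n_iter):
        i = rng.integers(n_cells)
        old = field[i]
        field[i] = cond_sampler(i)      # perturb: redraw from the conditional
        new_obj = objective(field)
        if new_obj <= obj or rng.random() < np.exp((obj - new_obj) / temp):
            obj = new_obj               # accept (Metropolis criterion)
        else:
            field[i] = old              # reject and revert
        temp *= cooling
    return field

# Toy usage: Gaussian conditional with a spatially varying mean (assumed),
# e.g. a porosity trend suggested by the geophysical data.
means = np.linspace(0.2, 0.4, 200)
field = sa_conditional_simulation(lambda i: rng.normal(means[i], 0.02),
                                  target_cov=[3e-4, 2e-4], lags=[1, 2],
                                  n_cells=200)
```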
Abstract:
The M-Coffee server is a web server that makes it possible to compute multiple sequence alignments (MSAs) by running several MSA methods and combining their output into one single model. This allows users to run all their methods of choice simultaneously without having to arbitrarily choose one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa (T-Coffee: a novel method for fast and accurate multiple sequence alignment. J. Mol. Biol. 2000; 302: 205-217); Wallace, O'Sullivan, Higgins and Notredame (M-Coffee: combining multiple sequence alignment methods with T-Coffee. Nucleic Acids Res. 2006; 34: 1692-1699)]. Given a set of sequences (DNA or protein) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open-source package distributed under a GPL license; it is available either as a standalone package or as a web service from www.tcoffee.org.
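A hedged sketch of running the combination locally through the T-Coffee package; the `-mode mcoffee` invocation follows T-Coffee's documented usage, and the input file name is a placeholder:

```python
import subprocess

# Run the M-Coffee mode of a locally installed T-Coffee;
# "sequences.fasta" is a placeholder FASTA input.
subprocess.run(["t_coffee", "sequences.fasta", "-mode", "mcoffee"], check=True)
```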
Abstract:
The objective of this work was to determine how taxonomy benefited from ecological, quantitative, site-based sampling methods in enchytraeid studies. Enchytraeids (small relatives of earthworms) were sampled in different phases of rain forest regeneration in the southern Mata Atlântica in Paraná, Brazil. The research combined ecological and taxonomic work, because enchytraeids are poorly studied and difficult to identify, and many new species were expected. The provision of large numbers of specimens enabled testing species diagnoses by investigating the ranges of character variation in larger series of specimens. Simplified species diagnoses, adapted to local conditions, were developed that allowed the identification of all specimens, including juveniles. Key characters and character states are presented for three genera: Achaeta, Hemienchytraeus and Guaranidrilus. Among several new species, a rare species, possibly a remnant of the autochthonous forest fauna, was found and described.
Local re-inversion coronary MR angiography: arterial spin-labeling without the need for subtraction.
Abstract:
PURPOSE: To implement a double-inversion bright-blood coronary MR angiography sequence using a cylindrical re-inversion prepulse for selective visualization of the coronary arteries. MATERIALS AND METHODS: Local re-inversion bright-blood magnetization preparation was implemented using a nonselective inversion followed by a cylindrical aortic re-inversion prepulse. After an inversion delay that allows for in-flow of the labeled blood pool into the coronary arteries, three-dimensional radial steady-state free-precession (SSFP) imaging (repetition/echo time, 7.2/3.6 ms; flip angle, 120 degrees; 16 profiles per RR interval; field of view, 360 mm; matrix, 512; twelve 3-mm slices) is performed. Coronary MR angiography was performed in three healthy volunteers and in one patient on a commercial 1.5 Tesla whole-body MR system. RESULTS: In all subjects, coronary arteries were selectively visualized with positive contrast. In addition, a middle-grade stenosis of the proximal right coronary artery was seen in one patient. CONCLUSION: A novel T1 contrast-enhancement strategy is presented for selective visualization of the coronary arteries without extrinsic contrast medium application. In comparison to former arterial spin-labeling schemes, the proposed magnetization preparation obviates the need for a second data set and subtraction.
Abstract:
This paper presents a validation study on statistical unsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise levels and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process; this way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data, where a quantitative validation compares the methods' results with a ground truth estimated from manual segmentations by experts. The validity of the various classification methods, both in labeling the image and in estimating tissue volume, is assessed with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that the simulated data results can also be extended to real data.
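As a hedged sketch of the intensity-only end of the model family compared above (pure Gaussian classes, no spatial model), using synthetic intensities rather than the study's data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic intensities for three tissue classes (illustrative values only).
rng = np.random.default_rng(0)
intensities = np.concatenate([
    rng.normal(60, 8, 2000),    # e.g. CSF
    rng.normal(110, 10, 3000),  # e.g. grey matter
    rng.normal(150, 9, 3000),   # e.g. white matter
]).reshape(-1, 1)

# Unsupervised fit of pure Gaussian intensity classes, then voxel labeling.
gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)
labels = gmm.predict(intensities)

# A Dice overlap against ground-truth masks would quantify label validity.
def dice(a, b):
    """Dice coefficient between two boolean masks."""
    return 2 * np.sum(a & b) / (np.sum(a) + np.sum(b))
```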
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences from inappropriate actions and decisions, and from global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and do not give information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of possible evolution.

These results were presented to different audiences, including 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, offering good prospects for adaptation to other research areas. Risk characterization at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, shapes a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.