897 results for Simulation-based methods


Relevance: 80.00%

Abstract:

Identifying and characterizing the genes responsible for inherited human diseases will ultimately lead to a more holistic understanding of disease pathogenesis, catalyze new diagnostic and treatment modalities, and provide insights into basic biological processes. This dissertation presents research aimed at delineating the genetic and molecular basis of human diseases through epigenetic and functional studies and can be divided into two independent areas of research. The first area of research describes the development of two high-throughput melting-curve-based methods to assay DNA methylation, referred to as McMSP and McCOBRA. The goal of this project was to develop DNA methylation methods that can be used to rapidly determine the DNA methylation status at a specific locus in a large number of samples. McMSP and McCOBRA provide several advantages over existing methods, as they are simple, accurate, robust, and high-throughput, making them applicable to large-scale DNA methylation studies. McMSP and McCOBRA were then used in an epigenetic study of the complex disease ankylosing spondylitis (AS). Specifically, I tested the hypothesis that aberrant patterns of DNA methylation in five AS candidate genes contribute to disease susceptibility. While no statistically significant methylation differences were observed between cases and controls, this is the first study to investigate the hypothesis that epigenetic variation contributes to AS susceptibility and therefore provides the conceptual framework for future studies.

In the second area of research, I performed experiments to better delimit the function of aryl hydrocarbon receptor-interacting protein-like 1 (AIPL1), which when mutated causes various forms of inherited blindness such as Leber congenital amaurosis. A yeast two-hybrid screen was performed to identify putative AIPL1-interacting proteins. After screening 2 × 10⁶ bovine retinal cDNA library clones, six unique putative AIPL1-interacting proteins were identified. While these six AIPL1 protein-protein interactions must be confirmed, their identification is an important step in understanding the functional role of AIPL1 within the retina and will provide insight into the molecular mechanisms underlying inherited blindness.

Relevance: 80.00%

Abstract:

Objectives. Minimal Important Differences (MIDs) establish benchmarks for interpreting mean differences in clinical trials involving quality-of-life outcomes and inform discussions of clinically meaningful change in patient status. The purpose of this study was therefore to assess MIDs for the Functional Assessment of Cancer Therapy–Melanoma (FACT-M).

Methods. A prospective validation study of the FACT-M was performed with 273 patients with stage I to IV melanoma. FACT-M, Karnofsky Performance Status (KPS), and Eastern Cooperative Oncology Group Performance Status (ECOG-PS) scores were obtained at baseline and 3 months following enrollment. Anchor- and distribution-based methods were used to assess MIDs, and the correspondence between MID ranges derived from each method was evaluated.

Results. This study indicates that approximate MID ranges for the FACT-M subscales are 5 to 8 points for the Trial Outcome Index, 4 to 5 points for the Melanoma Combined Subscale, 2 to 4 points for the Melanoma Subscale, and 1 to 2 points for the Melanoma Surgery Subscale. Each method produced similar but not identical ranges of MIDs.

Conclusions. The properties of the anchor instrument employed to derive MIDs directly affect the resulting MID ranges and point values. When MIDs are offered as supportive evidence of a clinically meaningful change, the anchor instrument used to derive thresholds should be clearly stated, along with evidence supporting its choice as the most appropriate for the domain of interest. In this analysis, the KPS was a more appropriate measure than the ECOG-PS for assessing MIDs.
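The distribution-based side of such an analysis is straightforward to sketch. Below is a minimal Python illustration of two commonly used distribution-based MID criteria (half a standard deviation, and one standard error of measurement); the scores and reliability value are hypothetical, not data from this study:

```python
import math

def distribution_based_mids(scores, reliability):
    """Two common distribution-based MID criteria for a QoL scale:
    half a standard deviation, and one standard error of measurement
    (SEM = SD * sqrt(1 - reliability))."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))
    sem = sd * math.sqrt(1.0 - reliability)
    return {"half_sd": 0.5 * sd, "one_sem": sem}

# Hypothetical baseline subscale scores and scale reliability
scores = [52, 60, 47, 55, 63, 49, 58, 51]
mids = distribution_based_mids(scores, reliability=0.85)
```

Anchor-based estimates, by contrast, require an external criterion (here KPS or ECOG-PS) and cannot be reduced to a formula over the scale's own scores.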

Relevance: 80.00%

Abstract:

Detection of multidrug-resistant tuberculosis (MDR-TB), a frequent cause of treatment failure, takes 2 or more weeks by culture. Rifampin (RIF) resistance is a hallmark of MDR-TB, and detection of mutations in the rpoB gene of Mycobacterium tuberculosis using molecular beacon probes with real-time quantitative polymerase chain reaction (qPCR) is a novel approach that takes ≤2 days. However, qPCR identification of resistant isolates, particularly isolates with mixed RIF-susceptible and RIF-resistant bacteria, is reader dependent, which limits its clinical use. The aim of this study was to develop an objective, reader-independent method to define rpoB mutants using beacon qPCR. This would facilitate the transition from a research protocol to the clinical setting, where high-throughput methods with objective interpretation are required. For this, DNAs from 107 M. tuberculosis clinical isolates with known susceptibility to RIF by culture-based methods were obtained from 2 regions where isolates had not previously been evaluated using molecular beacon qPCR: the Texas–Mexico border and Colombia. Using coded DNA specimens, mutations within an 81-bp hot spot region of rpoB were established by qPCR with 5 beacons spanning this region. Visual and mathematical approaches were used to establish whether the qPCR cycle threshold of the experimental isolate was significantly higher (mutant) than that of a reference wild-type isolate. Visual classification of the beacon qPCR required reader training for strains with a mixture of RIF-susceptible and RIF-resistant bacteria. Only then did visual interpretation by an experienced reader achieve 100% sensitivity and 94.6% specificity versus RIF resistance by culture phenotype, and 98.1% sensitivity and 100% specificity versus mutations based on DNA sequence. The mathematical approach was 98% sensitive and 94.5% specific versus culture, and 96.2% sensitive and 100% specific versus DNA sequence. Our findings indicate that the mathematical approach has advantages over visual reading: it uses a Microsoft Excel template to eliminate reader bias or inexperience, and it allows objective interpretation in high-throughput analyses, even in the presence of a mixture of RIF-resistant and RIF-susceptible bacteria, without the need for reader training.
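The abstract does not give the template's exact decision rule, but the core idea of a reader-independent call can be sketched as follows; the cycle-threshold cutoff used here is an assumed illustrative value, not the study's calibrated threshold:

```python
def classify_rpob(ct_experimental, ct_wildtype_ref, delta_cutoff=3.0):
    """Reader-independent call for one beacon: a significantly delayed
    cycle threshold (Ct) relative to the wild-type reference indicates
    a mismatch (mutation) under the probe."""
    delta_ct = ct_experimental - ct_wildtype_ref
    return "mutant" if delta_ct > delta_cutoff else "wild-type"

def classify_isolate(beacon_cts, reference_cts, delta_cutoff=3.0):
    """An isolate is called RIF-resistant if any of the beacons spanning
    the 81-bp hot spot region calls a mutation."""
    calls = [classify_rpob(e, r, delta_cutoff)
             for e, r in zip(beacon_cts, reference_cts)]
    return "RIF-resistant" if "mutant" in calls else "RIF-susceptible"
```

Applying a fixed numeric cutoff to every beacon is what removes the dependence on reader experience, at the price of calibrating that cutoff once against sequenced controls.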

Relevance: 80.00%

Abstract:

Changes in the ventilation of the Southern Ocean are thought to play an important role in deglacial carbon and radiocarbon evolution, but this has not been tested within a coupled climate-carbon model. Here, we present such a simulation based on a simple scenario of transient deglacial sinking of brines (sea-ice salt rejections) around Antarctica, which modulates Southern Ocean ventilation. This experiment is able to reproduce not only deglacial atmospheric changes in carbon and radiocarbon but also ocean radiocarbon records measured in the Atlantic, Southern and Pacific Oceans. Simulated for the first time in a fully coupled climate-carbon model including radiocarbon, our modeling results suggest that the deglacial changes in atmospheric carbon dioxide and radiocarbon were achieved by means of a breakdown in the glacial brine-induced stratification of the Southern Ocean.

Relevance: 80.00%

Abstract:

The microbial population in samples of basalt drilled from the north of the Australian Antarctic Discordance (AAD) during Ocean Drilling Program Leg 187 was studied using deoxyribonucleic acid (DNA)-based methods and culturing techniques. The results showed the presence of a microbial population characteristic of the basalt environment. DNA sequence analysis revealed that microbes grouping within the Actinobacteria, green nonsulfur bacteria, the Cytophaga/Flavobacterium/Bacteroides (CFB) group, the Bacillus/Clostridium group, and the beta and gamma subclasses of the Proteobacteria were present in the basalt samples collected. The most dominant phylogenetic group, both in terms of the number of sequences retrieved and the intensities of the DNA bands obtained with the denaturing gradient gel electrophoresis analysis, was the gamma Proteobacteria. Enrichment cultures showed phylogenetic affiliation with the Actinobacteria, the CFB group, the Bacillus/Clostridium group, and the alpha, beta, gamma, and epsilon subclasses of the Proteobacteria. Comparison of native and enriched samples showed that few of the microbes found in native basalt samples grew in the enrichment cultures. Only seven clusters, two clusters within each of the CFB and Bacillus/Clostridium groups and five clusters within the gamma Proteobacteria, contained sequences from both native and enriched basalt samples with significant similarity. Results from cultivation experiments showed the presence of the physiological groups of iron reducers and methane producers. The presence of the iron/manganese-reducing bacterium Shewanella was confirmed with DNA analysis. The results indicate that iron reducers and lithotrophic methanogenic Archaea are indigenous to the ocean crust basalt and that the methanogenic Archaea may be important primary producers in this basaltic environment.

Relevance: 80.00%

Abstract:

Effective static analyses have been proposed which infer bounds on the number of resolutions or reductions. These have the advantage of being independent from the platform on which the programs are executed and have been shown to be useful in a number of applications, such as granularity control in parallel execution. On the other hand, in distributed computation scenarios where platforms with different capabilities come into play, it is necessary to express costs in metrics that include the characteristics of the platform. In particular, it is especially interesting to be able to infer upper and lower bounds on actual execution times. With this objective in mind, we propose an approach which combines compile-time analysis for cost bounds with a one-time profiling of the platform in order to determine the values of certain parameters for a given platform. These parameters calibrate a cost model which, from then on, is able to compute statically time bound functions for procedures and to predict with a significant degree of accuracy the execution times of such procedures on the given platform. The approach has been implemented and integrated in the CiaoPP system.
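As a rough sketch of the idea (not the actual CiaoPP cost model), a platform parameter such as time-per-resolution can be fitted once from profiling runs and then combined with a statically inferred resolution-count bound:

```python
def calibrate(profile):
    """One-time platform profiling: given (resolution_count, measured_time)
    pairs from benchmark runs, fit time_per_resolution by least squares
    through the origin. A simplification of a richer multi-parameter
    cost model."""
    num = sum(n * t for n, t in profile)
    den = sum(n * n for n, _ in profile)
    return num / den

def predict_time(cost_fn, arg, time_per_resolution):
    """Static time bound: evaluate the compile-time resolution-count
    bound function for the input size and scale by the platform
    parameter."""
    return cost_fn(arg) * time_per_resolution

# Hypothetical profiling data: (resolutions, seconds)
tpr = calibrate([(1000, 0.010), (2000, 0.020), (4000, 0.040)])
# Suppose analysis inferred an upper bound of 2n + 10 resolutions
upper_bound = lambda n: 2 * n + 10
t = predict_time(upper_bound, 500, tpr)
```

The same machinery gives lower bounds by plugging in the inferred lower-bound cost function instead.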

Relevance: 80.00%

Abstract:

The relationship between abstract interpretation and partial deduction has received considerable attention and (partial) integrations have been proposed starting from both the partial deduction and abstract interpretation perspectives. In this work we present what we argue is the first fully described generic algorithm for efficient and precise integration of abstract interpretation and partial deduction. Taking as starting point state-of-the-art algorithms for context-sensitive, polyvariant abstract interpretation and (abstract) partial deduction, we present an algorithm which combines the best of both worlds. Key ingredients include the accurate success propagation inherent to abstract interpretation and the powerful program transformations achievable by partial deduction. In our algorithm, the calls which appear in the analysis graph are not analyzed w.r.t. the original definition of the procedure but w.r.t. specialized definitions of these procedures. Such specialized definitions are obtained by applying both unfolding and abstract executability. Our framework is parametric w.r.t. different control strategies and abstract domains. Different combinations of such parameters correspond to existing algorithms for program analysis and specialization. Simultaneously, our approach opens the door to the efficient computation of strictly more precise results than those achievable by each of the individual techniques. The algorithm is now one of the key components of the CiaoPP analysis and specialization system.

Relevance: 80.00%

Abstract:

The relationship between redd superimposition and spawning habitat availability was investigated in the brown trout (Salmo trutta L.) population inhabiting the river Castril (Granada, Spain). Redd surveys were conducted in 24 river sections to estimate the rate of redd superimposition. Used and available microhabitat was evaluated to compute the suitable spawning habitat (SSH) for brown trout. After analysing the microhabitat characteristics positively selected by females, SSH was defined as an area that met all of the following five requirements: water depth between 10 and 50 cm, mean water velocity between 30 and 60 cm s−1, bottom water velocity between 15 and 60 cm s−1, substrate size between 4 and 30 mm, and no embeddedness. Simple regression analyses showed that redd superimposition was not correlated with redd numbers, SSH or redd density. A simulation-based analysis was performed to estimate the superimposition rate if redds were randomly placed inside the SSH. This analysis revealed that the observed superimposition rate was higher than expected in 23 of 24 instances, this difference being significant (P < 0.05) in eight instances and right at the limit of statistical significance (P = 0.05) in another eight instances. Redd superimposition was high in sections with high redd density. High superimposition, however, was not exclusive to sections with high redd density and was found in moderate- and low-redd-density sections. This suggests that factors other than habitat availability are also responsible for redd superimposition. We argue that female preference for spawning over previously excavated redds may be the most likely explanation for high superimposition at lower densities.
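The random-placement baseline can be approximated with a small Monte Carlo simulation; the square-redd geometry below is an illustrative simplification, not the paper's exact procedure:

```python
import random

def expected_superimposition(n_redds, redd_area, ssh_area, trials=2000, seed=1):
    """Monte Carlo estimate of the redd superimposition rate under random
    placement inside the suitable spawning habitat (SSH). Redds are
    idealized as axis-aligned squares placed uniformly in a square SSH
    patch; a redd counts as superimposed if it overlaps any earlier one."""
    side = ssh_area ** 0.5
    r = redd_area ** 0.5          # redd side length
    rng = random.Random(seed)
    superimposed = 0
    for _ in range(trials):
        placed = []
        for _ in range(n_redds):
            x, y = rng.uniform(0, side - r), rng.uniform(0, side - r)
            # two equal squares overlap iff both center offsets are < r
            if any(abs(x - px) < r and abs(y - py) < r for px, py in placed):
                superimposed += 1
            placed.append((x, y))
    return superimposed / (trials * n_redds)

rate = expected_superimposition(n_redds=20, redd_area=0.5, ssh_area=100.0)
```

Comparing the observed rate against this null distribution is what yields the per-section P-values reported above.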

Relevance: 80.00%

Abstract:

A module to estimate risks of ozone damage to vegetation has been implemented in the Integrated Assessment Modelling system for the Iberian Peninsula. It was applied to compute three different indices for wheat and Holm oak: daylight AOT40 (accumulated ozone exposure over a threshold of 40 ppb), the cumulative ozone exposure index according to Directive 2008/50/EC (AOT40-D), and PODY (Phytotoxic Ozone Dose over a given threshold of Y nmol m−2 s−1). The use of these indices led to remarkable differences in the spatial patterns of relative ozone risk to vegetation. Ozone critical levels were exceeded in most of the modelling domain, and soil moisture content was found to have a significant impact on the results. According to the outputs of the model, daylight AOT40 constitutes a more conservative index than AOT40-D. Additionally, flux-based estimations indicate high-risk areas in Portugal, for both wheat and Holm oak, that are not identified by AOT-based methods.
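The exposure indices have simple operational definitions; a minimal sketch (with hypothetical hourly values, and generic rather than model-specific unit handling):

```python
def aot40(hourly_ppb, daylight_mask):
    """Daylight AOT40 (ppb h): sum of hourly exceedances of the ozone
    concentration over 40 ppb, accumulated over daylight hours only."""
    return sum(max(c - 40.0, 0.0)
               for c, day in zip(hourly_ppb, daylight_mask) if day)

def pody(hourly_flux, y):
    """PODY (mmol m-2): accumulated stomatal ozone flux above the
    threshold Y (nmol m-2 s-1), integrated here over one-hour steps."""
    return sum(max(f - y, 0.0) * 3600 for f in hourly_flux) * 1e-6

conc = [35.0, 45.0, 60.0, 38.0, 55.0]      # hypothetical hourly O3, ppb
day = [True, True, True, False, True]
a = aot40(conc, day)                       # exceedances: 5 + 20 + 15
```

The contrast between the concentration-based and flux-based views is visible in the inputs: AOT40 needs only concentrations, while PODY needs the modelled stomatal flux, which is where soil moisture enters.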

Relevance: 80.00%

Abstract:

In this paper, we present a real-time tracking strategy based on direct methods for tracking tasks on-board UAVs that is able to overcome problems posed by the challenging conditions of the task, e.g. constant vibrations, fast 3D changes, and limited on-board capacity. The vast majority of approaches make use of feature-based methods to track objects. Nonetheless, in this paper we show that although some of these feature-based solutions are faster, direct methods can be more robust under fast 3D motions (fast changes in position), some changes in appearance, constant vibrations (without requiring any specific hardware or software for video stabilization), and situations where part of the object to track is out of the field of view of the camera. The performance of the proposed strategy is evaluated with images from real-flight tests using different evaluation mechanisms (e.g. accurate position estimation using a Vicon system). Results show that our tracking strategy performs better than well-known feature-based algorithms and well-known configurations of direct methods, and that the recovered data are robust enough for vision-in-the-loop tasks.
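To make the contrast with feature-based tracking concrete, here is a minimal direct (intensity-based) tracker; real direct methods use gradient-based optimization (e.g. inverse compositional alignment) rather than the brute-force search shown, which only illustrates the principle of working on raw intensities:

```python
import numpy as np

def ssd_track(template, frame, prev_xy, search=5):
    """Exhaustively search a small window around the previous position
    for the translation minimizing the sum of squared differences (SSD)
    between the template and the frame patch."""
    th, tw = template.shape
    px, py = prev_xy
    best, best_xy = np.inf, prev_xy
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = px + dx, py + dy
            if x < 0 or y < 0 or y + th > frame.shape[0] or x + tw > frame.shape[1]:
                continue
            patch = frame[y:y + th, x:x + tw].astype(float)
            ssd = np.sum((patch - template) ** 2)
            if ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy

# Synthetic example: a bright 4x4 patch moved to (7, 9) in a dark frame
frame = np.zeros((20, 20))
frame[9:13, 7:11] = 1.0
pos = ssd_track(np.ones((4, 4)), frame, (5, 8))
```

Because every pixel contributes to the objective, the estimate degrades gracefully when part of the object leaves the view, which is one reason direct methods hold up under the conditions listed above.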

Relevance: 80.00%

Abstract:

The main problem of pedestrian dead-reckoning (PDR) using only a body-attached inertial measurement unit is the accumulation of heading errors. The heading provided by magnetometers in indoor buildings is in general not reliable and is therefore commonly not used. Recently, a new method called heuristic drift elimination (HDE) was proposed that minimises the heading error when navigating in buildings. It assumes that the majority of buildings have their corridors parallel to each other, or intersecting at right angles, and that consequently most of the time the person walks along a straight path with a heading constrained to one of four possible directions. In this article we study the performance of HDE-based methods in complex buildings, i.e. with pathways also oriented at 45°, long curved corridors, and wide areas where non-oriented motion is possible. We explain how the performance of the original HDE method can deteriorate in complex buildings, and also how severe errors can appear in the case of false matches with the building's dominant directions. Although magnetic compassing indoors has a chaotic behaviour, in this article we analyse large data sets in order to study the potential of magnetic compassing for estimating the absolute yaw angle of a walking person. Apart from these analyses, this article also proposes an improved HDE method called Magnetically-aided Improved Heuristic Drift Elimination (MiHDE), implemented over a PDR framework that uses foot-mounted inertial navigation with an extended Kalman filter (EKF). The EKF is fed with the MiHDE-estimated orientation error, gyro bias corrections, and the confidence in those corrections. We experimentally evaluated the performance of the proposed MiHDE-based PDR method, comparing it with the original HDE implementation. Results show that both methods perform very well in ideal orthogonal narrow-corridor buildings, and that MiHDE outperforms HDE for non-ideal trajectories (e.g. curved paths) and is robust against potential false dominant-direction matches.
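The core HDE heuristic can be sketched in a few lines; the gate and feedback gain below are illustrative values, not those of MiHDE:

```python
def hde_correction(heading_deg, dominant_step=45.0, gate_deg=10.0, gain=0.2):
    """One heuristic-drift-elimination step: if the current heading is
    within a gate of the nearest dominant building direction (here
    multiples of 45 deg, as in the complex-building case), feed back a
    fraction of the difference as a drift correction; otherwise leave
    the heading untouched."""
    nearest = round(heading_deg / dominant_step) * dominant_step
    error = heading_deg - nearest
    if abs(error) <= gate_deg:
        heading_deg -= gain * error
    return heading_deg % 360.0
```

The failure mode discussed above is visible here: on a long curved corridor the true heading drifts slowly past the gate of a wrong dominant direction, and the correction then pulls the estimate toward it, which is what the magnetic aiding in MiHDE is meant to prevent.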

Relevance: 80.00%

Abstract:

In this paper we propose a flexible Multi-Agent Architecture, together with a methodology for indoor location, which allows us to locate any mobile station (MS), such as a laptop, smartphone, tablet or robotic system, in an indoor environment using wireless technology. Our technology is complementary to GPS location finders, as it allows us to locate a mobile system in a specific room on a specific floor using Wi-Fi networks. The idea is that any MS will have at its disposal an agent, known as a Fuzzy Location Software Agent (FLSA), with minimal processing capacity, which collects the power received at different access points distributed around the floor and establishes its location on a plan of the floor of the building. In order to do so it communicates with the Fuzzy Location Manager Software Agent (FLMSA). The FLMSAs are local agents that form part of the management infrastructure of the Wi-Fi network of the organization. The FLMSA implements a location estimation methodology divided into three phases (measurement, calibration and estimation) for locating mobile stations. Our solution is a fingerprint-based positioning system that overcomes the problem of the relative effect of doors and walls on signal strength and is independent of the network device manufacturer. In the measurement phase, our system collects received signal strength indicator (RSSI) measurements from multiple access points. In the calibration phase, our system uses these measurements in a normalization process to create a radio map, a database of RSS patterns. Unlike traditional radio-map-based methods, our methodology normalizes RSS measurements collected at different locations on a floor. In the third phase, we use fuzzy controllers to locate an MS on the plan of the floor of a building. Experimental results demonstrate the accuracy of the proposed method. From these results it is clear that the system is highly likely to be able to locate an MS in a room or an adjacent room.
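The normalization idea can be sketched as follows; the fuzzy controllers of the third phase are replaced here by plain nearest-neighbour matching, and all RSSI values are hypothetical:

```python
def normalize(rssi):
    """Normalize an RSS vector by subtracting its mean across access
    points, reducing device- and environment-dependent offsets (the
    same motivation as the paper's normalization, though its exact
    procedure is not given in the abstract)."""
    m = sum(rssi) / len(rssi)
    return [r - m for r in rssi]

def locate(measurement, radio_map):
    """Match a normalized measurement against the radio map (a dict of
    location -> normalized RSS pattern) by squared Euclidean distance."""
    q = normalize(measurement)
    return min(radio_map,
               key=lambda loc: sum((a - b) ** 2
                                   for a, b in zip(q, radio_map[loc])))

radio_map = {                                  # hypothetical calibration data
    "room_101": normalize([-40, -70, -80]),
    "room_102": normalize([-75, -45, -60]),
}
room = locate([-48, -78, -88], radio_map)      # offset copy of room_101 pattern
```

Because a constant attenuation (a closed door, a different Wi-Fi chipset) shifts all access-point readings by roughly the same amount, the mean-subtracted pattern stays stable, which is what makes the match device-independent.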

Relevance: 80.00%

Abstract:

Leaf nitrogen and leaf surface area influence the exchange of gases between terrestrial ecosystems and the atmosphere, and play a significant role in the global cycles of carbon, nitrogen and water. The purpose of this study is to use field-based and satellite remote-sensing-based methods to assess leaf nitrogen pools in five diverse European agricultural landscapes located in Denmark, Scotland (United Kingdom), Poland, the Netherlands and Italy. REGFLEC (REGularized canopy reFLECtance) is an advanced image-based inverse canopy radiative transfer modelling system which has shown proficiency for regional mapping of leaf area index (LAI) and leaf chlorophyll (CHLl) using remote sensing data. In this study, high-spatial-resolution (10–20 m) remote sensing images acquired from the multispectral sensors aboard the SPOT (Satellite Pour l'Observation de la Terre) satellites were used to assess the capability of REGFLEC for mapping spatial variations in LAI and CHLl, and their relation to leaf nitrogen (Nl) data, in five diverse European agricultural landscapes. REGFLEC is based on physical laws and includes an automatic model parameterization scheme which makes the tool independent of field data for model calibration. In this study, REGFLEC performance was evaluated using LAI measurements and non-destructive measurements (using a SPAD meter) of leaf-scale CHLl and Nl concentrations in 93 fields representing crop- and grasslands of the five landscapes. Furthermore, empirical relationships between field measurements (LAI, CHLl and Nl) and five spectral vegetation indices (the Normalized Difference Vegetation Index, the Simple Ratio, the Enhanced Vegetation Index-2, the Green Normalized Difference Vegetation Index, and the green chlorophyll index) were used to assess field data coherence and to serve as a comparison basis for assessing REGFLEC model performance.
The field measurements showed strong vertical CHLl gradient profiles in 26% of fields, which affected REGFLEC performance as well as the relationships between spectral vegetation indices (SVIs) and field measurements. When the range of surface types increased, the REGFLEC results were in better agreement with field data than the empirical SVI regression models. Selecting only homogeneous canopies with uniform CHLl distributions as reference data for evaluation, REGFLEC was able to explain 69% of LAI observations (rmse = 0.76), 46% of measured canopy chlorophyll contents (rmse = 719 mg m−2) and 51% of measured canopy nitrogen contents (rmse = 2.7 g m−2). Better results were obtained for individual landscapes, except for Italy, where REGFLEC performed poorly due to a lack of dense vegetation canopies at the time of satellite recording; presence of vegetation is needed to parameterize the REGFLEC model. Combining REGFLEC- and SVI-based model results to minimize errors, a "snap-shot" assessment of total leaf nitrogen pools in the five landscapes varied from 0.6 to 4.0 t km−2. Differences in leaf nitrogen pools between landscapes are attributed to seasonal variations, extents of agricultural area, species variations, and spatial variations in nutrient availability. In order to facilitate a substantial assessment of variations in Nl pools and their relation to landscape-based nitrogen and carbon cycling processes, time series of satellite data are needed. The upcoming Sentinel-2 satellite mission will provide new multiple narrowband data opportunities at high spatio-temporal resolution, which are expected to further improve remote sensing capabilities for mapping LAI, CHLl and Nl.
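The five SVIs used as the empirical comparison basis have standard closed forms; a sketch from per-band surface reflectances (band choices generic rather than SPOT-specific, and EVI2 in its common two-band formulation):

```python
def vegetation_indices(green, red, nir):
    """Five common spectral vegetation indices computed from green, red
    and near-infrared surface reflectances."""
    return {
        "NDVI":     (nir - red) / (nir + red),             # Normalized Difference VI
        "SR":       nir / red,                             # Simple Ratio
        "EVI2":     2.5 * (nir - red) / (nir + 2.4 * red + 1.0),
        "GNDVI":    (nir - green) / (nir + green),         # Green NDVI
        "CI_green": nir / green - 1.0,                     # green chlorophyll index
    }

# Hypothetical reflectances for a moderately dense canopy
idx = vegetation_indices(green=0.10, red=0.05, nir=0.40)
```

Unlike REGFLEC, which inverts a radiative transfer model, these indices must be regressed against field data per site, which is why the abstract treats them as the empirical baseline.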

Relevance: 80.00%

Abstract:

Background. Magnetoencephalography (MEG) provides a direct measure of brain activity with high combined spatiotemporal resolution. Preprocessing is necessary to reduce contributions from environmental interference and biological noise.

New method. The effect of different preprocessing techniques on the signal-to-noise ratio is evaluated. The signal-to-noise ratio (SNR) was defined as the ratio between the mean signal amplitude (evoked field) and the standard error of the mean over trials.

Results. Recordings from 26 subjects, obtained during an event-related visual paradigm with an Elekta MEG scanner, were employed. Two methods were considered as first-step noise reduction: Signal Space Separation (SSS) and temporal Signal Space Separation (tSSS), which decompose the signal into components with origin inside and outside the head. Both algorithms increased the SNR by approximately 100%. Epoch-based methods, aimed at identifying and rejecting epochs containing eye blinks, muscular artifacts and sensor jumps, provided an SNR improvement of 5–10%. The decomposition methods evaluated were independent component analysis (ICA) and second-order blind identification (SOBI). The increase in SNR was about 36% with ICA and 33% with SOBI.

Comparison with existing methods. No previous systematic evaluation of the effect of the typical preprocessing steps on the SNR of the MEG signal has been performed.

Conclusions. The application of either SSS or tSSS is mandatory in Elekta systems; no significant differences were found between the two. While epoch-based methods have been routinely applied, the less often considered decomposition methods were clearly superior, and their use therefore seems advisable.
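The SNR definition above is easy to state in code; the trial amplitudes below are hypothetical single-sensor, single-time-point values:

```python
import math

def evoked_snr(trials):
    """SNR as defined in the study: mean evoked amplitude across trials
    divided by the standard error of the mean over trials, for a single
    sensor/time point."""
    n = len(trials)
    mean = sum(trials) / n
    var = sum((x - mean) ** 2 for x in trials) / (n - 1)
    sem = math.sqrt(var / n)
    return mean / sem

# Hypothetical amplitudes (fT) of one evoked-field sample over four trials
snr = evoked_snr([100.0, 110.0, 90.0, 100.0])
```

Under this definition, any preprocessing step that reduces trial-to-trial variability without attenuating the evoked mean raises the SNR, which is exactly how the 100%, 5–10% and ~35% improvements above are compared.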

Relevance: 80.00%

Abstract:

Purpose – The purpose of this paper is to present a simulation-based evaluation method for the comparison of different organizational forms and software support levels in the field of supply chain management (SCM).

Design/methodology/approach – Apart from widely known logistic performance indicators, the discrete event simulation model explicitly considers coordination cost stemming from iterative administration procedures.

Findings – The method is applied to an exemplary supply chain configuration considering various parameter settings. Curiously, additional coordination cost does not always result in improved logistic performance. Influence factor variations lead to different organizational recommendations. The results confirm the high importance of (up to now) disregarded dimensions when evaluating SCM concepts and IT tools.

Research limitations/implications – The model is based on simplified product and network structures. Future research shall include more complex, real-world configurations.

Practical implications – The developed method is designed for the identification of improvement potential when SCM software is employed. Coordination schemes based only on ERP systems are valid alternatives in industrial practice because significant IT investment can be avoided. Therefore, the evaluation of these coordination procedures, in particular the cost due to iterations, is of high managerial interest, and the method provides a comprehensive tool for strategic IT decision making.

Originality/value – Reviewed literature is mostly focused on the benefits of SCM software implementations. However, ERP-system-based supply chain coordination is still widespread industrial practice, and the associated coordination cost has not been addressed by researchers.