953 results for Reliable Computations
Abstract:
Chagas' disease is an illness that affects millions of people in Central and South America. Both a prophylactic drug that can be added to human blood and a safe, reliable therapeutic drug are greatly needed to control the disease. Herein, we report the trypanocidal activity of 15 crude extracts and 14 compounds (limonoids and triterpenes), as well as the isolation of 25 known compounds (6 limonoids, 12 triterpenes, 1 sesquiterpene, 5 steroids, and 1 flavonoid) from Cedrela fissilis. The present study shows that this plant is a promising source of active compounds for the control of Chagas' disease. The inhibitory activity found for odoratol indicates that it is potentially useful as an alternative to the chemoprophylactic agent gentian violet.
Abstract:
Background/Aims: It is a challenge to adapt traditional in vitro diffusion experiments to ocular tissue. Thus, the aim of this work was to present experimental evidence on the integrity, barrier function and maintenance of electrical properties of the porcine cornea over 6 h of experiment when the tissue is mounted on an inexpensive and easy-to-use in vitro model for ocular iontophoresis. Methods: A modified Franz diffusion cell containing two ports for the insertion of the electrodes and a receiving compartment that does not need gassing with carbogen was used in the studies. Corneal transmission electron microscopy images were obtained, and diffusion experiments with fluorescent markers were performed to examine the integrity of the barrier function. The preservation of the negatively charged corneal epithelium was verified by determining the electro-osmotic flow of a hydrophilic, non-ionized molecule. Results: The diffusion cell was able to maintain the temperature, homogenization, porcine corneal epithelial structure integrity, barrier function and electrical characteristics throughout the 6 h permeation experiment, without requiring CO(2) gassing, when the receiving chamber was filled with 25 mM HEPES buffer solution. Conclusion: The system described here is inexpensive, easy to handle and reliable as an in vitro model for iontophoretic ocular delivery studies. Copyright (C) 2010 S. Karger AG, Basel
Abstract:
The supervised pattern recognition methods K-nearest neighbors (KNN), stepwise discriminant analysis (SDA), and soft independent modelling of class analogy (SIMCA) were employed in this work with the aim of investigating the relationship between the molecular structure of 27 cannabinoid compounds and their analgesic activity. Previous analyses using two unsupervised pattern recognition methods (principal component analysis, PCA, and hierarchical cluster analysis, HCA) had been performed, and five descriptors were selected as the most relevant for the analgesic activity of the compounds studied: R(3) (charge density on the substituent at position C(3)), Q(1) (charge on atom C(1)), A (surface area), log P (logarithm of the partition coefficient) and MR (molecular refractivity). The supervised methods (SDA, KNN, and SIMCA) were employed in order to construct a reliable model able to predict the analgesic activity of new cannabinoid compounds and to validate our previous study. The results obtained using SDA, KNN, and SIMCA agree perfectly with our previous model. Comparing the SDA, KNN, and SIMCA results with the PCA and HCA ones, we found that all of the multivariate statistical methods classified the cannabinoid compounds studied into the same three groups: active, moderately active, and inactive.
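A minimal sketch of the KNN step described above, assuming the five selected descriptors (R(3), Q(1), A, log P, MR) are available as a 27 × 5 feature matrix; the descriptor values and activity labels below are random placeholders rather than data from the study, and scikit-learn stands in for whatever chemometrics software was actually used.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(27, 5))                 # 27 compounds x 5 descriptors (placeholder values)
y = rng.choice(["active", "moderately active", "inactive"], size=27)

# Descriptors live on very different scales (charges vs. surface area), so standardize first.
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
model.fit(X, y)

new_compound = rng.normal(size=(1, 5))       # descriptors of a hypothetical new cannabinoid
print(model.predict(new_compound))
```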
Abstract:
Objective: To investigate the effects of bilateral, surgically induced functional inhibition of the subthalamic nucleus (STN) on general language, high-level linguistic abilities, and semantic processing skills in a group of patients with Parkinson's disease. Methods: Comprehensive linguistic profiles were obtained up to one month before and three months after bilateral implantation of electrodes in the STN, during active deep brain stimulation (DBS), in five subjects with Parkinson's disease (mean age, 63.2 years). Equivalent linguistic profiles were generated over a three month period for a non-surgical control cohort of 16 subjects with Parkinson's disease (NSPD) (mean age, 64.4 years). Education and disease duration were similar in the two groups. Initial assessment and three month follow-up performance profiles were compared within subjects by paired t tests. Reliable change indices (RCI), representing clinically significant alterations in performance over time, were calculated for each of the assessment scores achieved by the five STN-DBS cases and the 16 NSPD controls, relative to performance variability within a group of 16 non-neurologically impaired adults (mean age, 61.9 years). Proportions of reliable change were then compared between the STN-DBS and NSPD groups. Results: Paired comparisons within the STN-DBS group showed prolonged postoperative semantic processing reaction times for a range of word types coded for meanings and meaning relatedness. Case-by-case analyses of reliable change across language assessments and groups revealed differences in proportions of change over time within the STN-DBS and NSPD groups in the domains of high-level linguistics and semantic processing. Specifically, when compared with the NSPD group, the STN-DBS group showed a proportionally significant (p
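The abstract does not spell out the RCI formulation used; a commonly cited form (Jacobson and Truax), for a baseline score \(x_1\) and follow-up score \(x_2\), is

\[
\mathrm{RCI} = \frac{x_2 - x_1}{S_{\mathrm{diff}}}, \qquad
S_{\mathrm{diff}} = \sqrt{2\,S_E^{2}}, \qquad
S_E = s_1 \sqrt{1 - r_{xx}},
\]

where \(s_1\) is the baseline standard deviation (here presumably estimated from the non-neurologically impaired control group) and \(r_{xx}\) is the test-retest reliability of the measure; \(|\mathrm{RCI}| > 1.96\) is conventionally treated as reliable change. The exact variant used in the study may differ.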
Abstract:
Traditionally the basal ganglia have been implicated in motor behavior, as they are involved in both the execution of automatic actions and the modification of ongoing actions in novel contexts. With respect to cognition, however, the role of the basal ganglia has not been defined as explicitly. Relative to linguistic processes, contemporary theories of subcortical participation in language have endorsed a role for the globus pallidus internus (GPi) in the control of lexical-semantic operations. However, attempts to empirically validate these postulates have been largely limited to neuropsychological investigations of verbal fluency abilities subsequent to pallidotomy. We evaluated the impact of bilateral posteroventral pallidotomy (BPVP) on language function across a range of general and high-level linguistic abilities, and validated/extended working theories of pallidal participation in language. Comprehensive linguistic profiles were compiled up to 1 month before and 3 months after BPVP in 6 subjects with Parkinson's disease (PD). Commensurate linguistic profiles were also gathered over a 3-month period for a nonsurgical control cohort of 16 subjects with PD and a group of 16 non-neurologically impaired controls (NC). Nonparametric between-groups comparisons were conducted and reliable change indices calculated, relative to baseline/3-month follow-up difference scores. Group-wise statistical comparisons between the three groups failed to reveal significant postoperative changes in language performance. Case-by-case data analysis relative to clinically consequential change indices revealed reliable alterations in performance across several language variables as a consequence of BPVP. These findings lend support to models of subcortical participation in language that promote a role for the GPi in lexical-semantic manipulation mechanisms. Concomitant improvements and decrements in postoperative performance were interpreted within the context of additive and subtractive postlesional effects. Relative to parkinsonian cohorts, clinically reliable versus statistically significant changes on a case-by-case basis may provide the most accurate method of characterizing the way in which pathophysiologically divergent basal ganglia linguistic circuits respond to BPVP.
Abstract:
This study determined the inter-tester and intra-tester reliability of physiotherapists measuring the functional motor ability of traumatic brain injury clients using the Clinical Outcomes Variable Scale (COVS). To test inter-tester reliability, 14 physiotherapists scored the ability of 16 videotaped patients to execute the items that comprise the COVS. Intra-tester reliability was determined by four physiotherapists repeating their assessments after one week and again three months later. The intra-class correlation coefficients (ICC) were very high for both inter-tester reliability (ICC > 0.97 for total COVS scores, ICC > 0.93 for individual COVS items) and intra-tester reliability (ICC > 0.97). This study demonstrates that physiotherapists are reliable in the administration of the COVS.
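For illustration of the statistic reported above, the sketch below computes a two-way random-effects, absolute-agreement, single-rater intraclass correlation, ICC(2,1), from a subjects-by-raters score matrix. The data are synthetic and the specific ICC form used in the study is not stated in the abstract, so treat this only as an illustration of the calculation.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is an (n_subjects, k_raters) matrix of ratings.
    """
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)            # per-subject means
    col_means = scores.mean(axis=0)            # per-rater means

    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((scores - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)                    # between-subjects mean square
    msc = ss_cols / (k - 1)                    # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))         # residual mean square

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: 16 subjects rated by 14 raters (random data, not the COVS scores).
rng = np.random.default_rng(0)
true_scores = rng.normal(50, 10, size=(16, 1))
ratings = true_scores + rng.normal(0, 1.5, size=(16, 14))
print(round(icc_2_1(ratings), 3))              # high agreement -> ICC close to 1
```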
Abstract:
In recent years, the phrase 'genomic medicine' has increasingly been used to describe a new development in medicine that holds great promise for human health. This new approach to health care uses knowledge of an individual's genetic make-up to identify those who are at higher risk of developing certain diseases and to intervene at an earlier stage to prevent these diseases. Identifying genes that are involved in disease aetiology will provide researchers with tools to develop better treatments and cures. A major role within this field is attributed to 'predictive genomic medicine', which proposes screening healthy individuals to identify those who carry alleles that increase their susceptibility to common diseases, such as cancers and heart disease. Physicians could then intervene even before the disease manifests and advise individuals with a higher genetic risk to change their behaviour - for instance, to exercise or to eat a healthier diet - or offer drugs or other medical treatment to reduce their chances of developing these diseases. These promises have fallen on fertile ground among politicians, health-care providers and the general public, particularly in light of the increasing costs of health care in developed societies. Various countries have established databases on the DNA and health information of whole populations as a first step towards genomic medicine. Biomedical research has also identified a large number of genes that could be used to predict someone's risk of developing a certain disorder. But it would be premature to assume that genomic medicine will soon become a reality, as many problems remain to be solved. Our knowledge about most disease genes and their roles is far from sufficient to make reliable predictions about a patient's risk of actually developing a disease. In addition, genomic medicine will create new political, social, ethical and economic challenges that will have to be addressed in the near future.
Abstract:
This experiment investigated whether the stability of rhythmic unimanual movements is primarily a function of perceptual/spatial orientation or is neuro-mechanical in nature. Eight participants performed rhythmic flexion and extension movements of the left wrist for 30 s at a frequency of 2.25 Hz, paced by an auditory metronome. Each participant performed 8 flex-on-the-beat trials and 8 extend-on-the-beat trials in one of two load conditions, loaded and unloaded. In the loaded condition, a servo-controlled torque motor was used to apply a small viscous load that resisted the flexion phase of the movement only. Both the amplitude and frequency of the movements generated in the loaded and unloaded conditions were statistically equivalent. However, in the loaded condition, movements in which participants were required to flex on the beat became less stable (more variable), while extend-on-the-beat movements remained unchanged compared with the unloaded condition. The small alteration in required muscle force was sufficient to produce reliable changes in movement stability even in a situation where the movement kinematics were identical. These findings support the notion that muscular constraints, independent of spatial dependencies, can be sufficiently strong to reliably influence coordination in a simple unimanual task.
Abstract:
40Ar/39Ar laser incremental heating analyses of individual grains of supergene jarosite, alunite, and cryptomelane from weathering profiles in the Dugald River area, Queensland, Australia, show a strong positive correlation between a sample's age and its elevation. We analyzed 125 grains extracted from 35 hand specimens collected from weathering profiles at 11 sites located at 3 distinct elevations. The highest elevation profile hosts the oldest supergene minerals, whereas progressively younger samples occur at lower positions in the landscape. The highest elevation sampling sites (three sites), located on top of an elongated mesa (255 to 275 m elevation), yield ages in the 16 to 12 Ma range. Samples from an intermediate elevation site (225 to 230 m elevation) yield ages in the 6 to 4 Ma range. Samples collected at the lowest elevation sites (200 to 220 m elevation) yield ages in the 2.2 to 0.8 Ma interval. Grains of supergene alunite, jarosite, and cryptomelane analyzed from individual hand specimens yield reproducible results, confirming the suitability of these minerals for 40Ar/39Ar geochronology. Multiple samples collected from the same site also yield reproducible results, indicating that the ages measured are true precipitation ages for the samples analyzed. Different sites, up to 3 km apart, sampled from weathering profiles at the same elevation again yield reproducible results. The consistency of results confirms that 40Ar/39Ar geochronology of supergene jarosite, alunite, and cryptomelane yields ages of formation of weathering profiles, providing a reliable numerical basis for differentiating and correlating these profiles. The age versus elevation relationship obtained suggests that the stepped landscapes in the Dugald River area record a progressive downward migration of a relatively flat weathering front. The steps in the landscape result from differential erosion of previously weathered bedrock displaying different susceptibility to weathering and contrasting resistance to erosion. Combined, the age versus elevation relationships measured yield a weathering rate of 3.8 m Myr−1 (for the past 15 Ma) if a descending subhorizontal weathering front is assumed. The results also permit the calculation of the erosion rate of the more easily weathered and eroded lithologies, assuming an initially flat landscape as proposed in models of episodic landscape development. The average erosion rate for the past 15 Ma is 3.3 m Myr−1, consistent with erosion rates obtained by cosmogenic isotope studies in the region.
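As a rough consistency check (using midpoints of the reported elevation and age ranges rather than the authors' actual regression), the quoted weathering rate follows from

\[
\text{rate} \approx \frac{\Delta z}{\Delta t} \approx \frac{(265 - 210)\ \mathrm{m}}{\sim 15\ \mathrm{Myr}} \approx 3.7\ \mathrm{m\,Myr^{-1}},
\]

which is in line with the reported 3.8 m Myr−1 for a subhorizontal weathering front descending through the landscape.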
Abstract:
Market-based transmission expansion planning gives investors information on where the most cost-efficient places to invest are, and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are system planners' concerns. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economical losses during operation in a competitive market. It reflects both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. In turn, it enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors to guide their investment. This index can truly reflect the random behaviors of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), showing how the EEL can predict the current system bottleneck under future operational conditions and how EEL can be used as one of the planning objectives to determine future optimal plans. A well-known simulation method, Monte Carlo simulation, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
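The sketch below illustrates the general Monte Carlo idea behind an expected-economical-loss style index: sample random component availabilities and load levels, price any unserved demand, and average the cost. The toy network, availability figures and value of lost load are invented for illustration; the paper's exact EEL formulation, the NCP enhancement and the GA-based planning loop are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical three-corridor transmission model (all values assumed).
capacity = np.array([400.0, 300.0, 250.0])       # MW per corridor
availability = np.array([0.98, 0.97, 0.99])      # probability a corridor is in service
VOLL = 1000.0                                    # $/MWh value of lost load (assumed)

in_service = rng.random((N, 3)) < availability   # sampled corridor states
transfer_limit = (in_service * capacity).sum(axis=1)
load = rng.normal(800.0, 120.0, size=N)          # sampled system load, MW
unserved = np.clip(load - transfer_limit, 0.0, None)

eel = (unserved * VOLL).mean()                   # expected economical loss per sampled hour
print(f"EEL estimate: {eel:,.0f} $/h")
```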
Abstract:
In this paper we follow the BOID (Belief, Obligation, Intention, Desire) architecture to describe agents and agent types in Defeasible Logic. We argue, in particular, that the introduction of obligations can provide a new reading of the concepts of intention and intentionality. Then we examine the notion of social agent (i.e., an agent where obligations prevail over intentions) and discuss some computational and philosophical issues related to it. We show that the notion of social agent either requires more complex computations or has some philosophical drawbacks.
Abstract:
Expansion tubes are impulse facilities capable of generating highly energetic hypersonic flows. This work surveys a broad range of flow conditions produced in the facility X1 with carbon dioxide test gas, for simulation of spacecraft entry into the Martian atmosphere. Conditions with nominal flow speeds of 7, 9, 11 and 13 km/s were tested. The freestream conditions were calibrated using static/Pitot pressure measurements and advanced optical diagnostics. An extensive set of holographic interferometry experiments was performed on flows over wedges for quantitative study of freestream and post-shock densities, and post-shock ionisation. A one-dimensional code with frozen and equilibrium chemistry capabilities was used to estimate the freestream conditions. An equilibrium chemistry model produced a good match to the measured freestream quantities at the high-enthalpy conditions, which are a major aim of this facility's operation. The freestream in the lower-enthalpy conditions was found to be heavily influenced by chemical non-equilibrium. Non-equilibrium in the final unsteady expansion process of flow generation was accounted for by switching from equilibrium to frozen chemistry at a predetermined point. Comparison between the freestream density results of holographic interferometry, pressure measurements and computations shows good agreement.
Abstract:
One of the challenges in scientific visualization is to generate software libraries suitable for the large-scale data emerging from tera-scale simulations and instruments. We describe the efforts currently under way at SDSC and NPACI to address these challenges. The scope of the SDSC project spans data handling, graphics, visualization, and scientific application domains. Components of the research focus on the following areas: intelligent data storage, layout and handling, using an associated “Floor-Plan” (metadata); performance optimization on parallel architectures; extension of SDSC’s scalable, parallel, direct volume renderer to allow perspective viewing; and interactive rendering of fractional images (“imagelets”), which facilitates the examination of large datasets. These concepts are coordinated within a data-visualization pipeline, which operates on component data blocks sized to fit within the available computing resources. A key feature of the scheme is that the metadata, which tag the data blocks, can be propagated and applied consistently: at the disk level; in distributing the computations across parallel processors; in “imagelet” composition; and in feature tagging. The work reflects the emerging challenges and opportunities presented by the ongoing progress in high-performance computing (HPC) and the deployment of the data, computational, and visualization Grids.
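A toy sketch of the block-plus-metadata idea described above: a volume is split into blocks sized to fit the available resources, and each block carries a small metadata record that pipeline stages read, update and pass along. The class names and fields are illustrative assumptions, not the SDSC “Floor-Plan” implementation.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Block:
    data: np.ndarray
    meta: dict = field(default_factory=dict)   # e.g. origin, extent, feature tags

def split_volume(volume: np.ndarray, block_shape=(64, 64, 64)):
    """Split a volume into blocks sized to fit within available resources."""
    bx, by, bz = block_shape
    for x in range(0, volume.shape[0], bx):
        for y in range(0, volume.shape[1], by):
            for z in range(0, volume.shape[2], bz):
                chunk = volume[x:x + bx, y:y + by, z:z + bz]
                yield Block(chunk, {"origin": (x, y, z), "shape": chunk.shape})

def tag_features(block: Block, threshold: float) -> Block:
    """One pipeline stage: record a per-block result in the propagated metadata."""
    block.meta["max_value"] = float(block.data.max())
    block.meta["has_feature"] = block.meta["max_value"] > threshold
    return block

volume = np.random.default_rng(0).random((128, 128, 128), dtype=np.float32)
tagged = [tag_features(b, threshold=0.99) for b in split_volume(volume)]
print(sum(b.meta["has_feature"] for b in tagged), "of", len(tagged), "blocks tagged")
```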
Abstract:
There are many techniques for electricity market price forecasting. However, most of them are designed for expected price analysis rather than price spike forecasting. An effective method of predicting the occurrence of spikes has not yet been reported in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. In this paper, feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes. A simple introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
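As an illustration of the spike-occurrence classification idea (one of the two predictors mentioned), the sketch below trains an RBF-kernel support vector machine on synthetic interval-level features. The feature names, the rule generating the synthetic labels and the class weighting are assumptions made for this example only, not the attributes or data used in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
demand = rng.normal(6000, 800, n)           # MW
reserve_margin = rng.normal(0.15, 0.05, n)  # fraction of capacity held in reserve
temperature = rng.normal(25, 7, n)          # deg C

# Synthetic rule: spikes become likely when demand is high and reserves are thin.
p_spike = 1 / (1 + np.exp(-(0.002 * (demand - 7000) - 30 * (reserve_margin - 0.08))))
y = rng.random(n) < p_spike

X = np.column_stack([demand, reserve_margin, temperature])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Spikes are rare, so weight the classes to keep the SVM from ignoring them.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", class_weight="balanced"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```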
Abstract:
This paper reviews the potential value to land managers of three types of spatial technology, namely satellite imagery, satellite positioning systems and supporting computer software. Developments in remote sensing and the relative advantages of multispectral and hyperspectral images are discussed. The main challenge to the wider use of remote sensing as a land management tool is seen as uncertainty about whether apparent relationships between biophysical variables and spectral reflectance are direct and causal, or artefacts of particular images. Developments in satellite positioning systems are presented in the context of land managers' need for position estimates in situations where absolute precision may or may not be required. The role of computer software in supporting developments in spatial technology is described. Spatial technologies are seen as having matured beyond empirical applications to the stage where they are useful and reliable land management tools. In addition, computer software has become more user-friendly, and this has facilitated data collection and manipulation by semi-expert as well as specialist staff.