933 results for Visualization Of Interval Methods


Relevance: 100.00%

Abstract:

Modelling the interaction of terahertz (THz) radiation with biological tissue poses many interesting problems. THz radiation is not obviously described by either an electric field distribution or an ensemble of photons, and biological tissue is an inhomogeneous medium with an electronic permittivity that is both spatially and frequency dependent, making it a complex system to model. A three-layer system of parallel-sided slabs was used as the system through which the passage of THz radiation was simulated. Two modelling approaches were developed: a thin film matrix model and a Monte Carlo model. The source data for each of these methods, taken at the same time as the data recorded to experimentally verify them, was a THz spectrum that had passed through air only. Experimental verification of the two models was carried out using a three-layered in vitro phantom. Simulated transmission spectrum data were compared with experimental transmission spectrum data, first to determine and then to compare the accuracy of the two methods. Good agreement was found, with typical results having a correlation coefficient of 0.90 for the thin film matrix model and 0.78 for the Monte Carlo model over the full THz spectrum. Further work is underway to improve the models above 1 THz.
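
As an illustration of the first of these approaches, below is a minimal sketch of a thin-film (characteristic/transfer) matrix calculation of normal-incidence transmission through a three-layer slab stack. The refractive indices, layer thicknesses and frequency range are placeholders, not the tissue parameters used in the study.

```python
import numpy as np

C0 = 299_792_458.0  # speed of light, m/s

def layer_matrix(n, d, freq):
    """Characteristic matrix of one homogeneous layer at normal incidence."""
    delta = 2 * np.pi * n * d * freq / C0          # (complex) phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmission(ns, ds, freq, n_in=1.0, n_out=1.0):
    """Power transmittance of a layer stack between two semi-infinite media."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(ns, ds):
        M = M @ layer_matrix(n, d, freq)
    B, C = M @ np.array([1.0, n_out])
    t = 2 * n_in / (n_in * B + C)                  # amplitude transmission
    return (n_out / n_in) * abs(t) ** 2

# Illustrative three-layer phantom: complex index n + ik models absorption
ns = [1.5 + 0.05j, 1.8 + 0.10j, 1.5 + 0.05j]
ds = [0.5e-3, 1.0e-3, 0.5e-3]                      # thicknesses in metres
freqs = np.linspace(0.1e12, 1.0e12, 200)           # 0.1 - 1.0 THz
spectrum = [transmission(ns, ds, f) for f in freqs]
```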

Relevance: 100.00%

Abstract:

DNA G-quadruplexes are one of the targets being actively explored for anti-cancer therapy by inhibiting them with small molecules. This computational study was conducted to predict the binding strengths and orientations of a set of novel dimethyl-amino-ethyl-acridine (DACA) analogues that were designed and synthesized in our laboratory but did not diffract in synchrotron light. The crystal structure of the DNA G-quadruplex (TGGGGT)4 (PDB: 1O0K) was used as the target for their binding properties in our studies. We used both force field (FF) and QM/MM derived atomic charge schemes simultaneously for comparing the predictions of drug binding modes and their energetics. This study evaluates the comparative performance of fixed-point-charge based Glide XP docking and quantum polarized ligand docking schemes. These results will provide insights into the effects of including or ignoring drug-receptor interfacial polarization events in molecular docking simulations, which in turn will aid the rational selection of computational methods at different levels of theory in future drug design programs. Plenty of molecular modelling tools and methods currently exist for modelling drug-receptor, protein-protein, or DNA-protein interactions at different levels of complexity. Yet the capacity of such tools to describe various physico-chemical properties more accurately is the next step ahead in current research. In particular, the usage of the most accurate quantum mechanics (QM) methods is severely restricted by their tedious nature. Though the usage of massively parallel supercomputing environments has resulted in a tremendous improvement in molecular mechanics (MM) calculations such as molecular dynamics, QM methods are still capable of dealing with only a couple of tens to hundreds of atoms. One efficient strategy that utilizes the powers of both MM and QM is the QM/MM hybrid method. Lately, attempts have been directed towards the goal of deploying several different QM methods for the betterment of force field based simulations, but with practical restrictions in place. One such method utilizes the inclusion of charge polarization events at the drug-receptor interface, which are not explicitly present in the MM FF.

Relevance: 100.00%

Abstract:

Background: The analysis of the Auditory Brainstem Response (ABR) is of fundamental importance to the investigation of auditory system behaviour, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analysing the ABR, clinicians are often interested in the identification of ABR signal components referred to as Jewett waves. In particular, the detection and study of the time when these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. Significant differences in inter-examiner results may lead to completely distinct clinical interpretations of the state of the auditory system. In this context, the aim of this research was to evaluate the inter-examiner agreement and variability in the manual classification of ABR. Methods: A total of 160 ABR data samples were collected, at four different stimulus intensities (80 dBHL, 60 dBHL, 40 dBHL and 20 dBHL), from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). Four examiners with expertise in the manual classification of ABR components participated in the study. The Bland-Altman statistical method was employed for the assessment of inter-examiner agreement and variability. The mean, standard deviation and error of the bias, which is the difference between examiners' annotations, were estimated for each pair of examiners. Scatter plots and histograms were employed for data visualization and analysis. Results: In most comparisons the differences between examiners' annotations were below 0.1 ms, which is clinically acceptable. In four cases, a large error and standard deviation (>0.1 ms) were found, indicating the presence of outliers and thus discrepancies between examiners. Conclusions: Our results quantify the inter-examiner agreement and variability of the manual analysis of ABR data, and they also allow for the determination of different patterns of manual ABR analysis.
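
A minimal sketch of the Bland-Altman computation referred to above, applied to one pair of examiners: the bias (mean difference), its standard deviation, and the 95% limits of agreement. The latency values are illustrative, not data from the study.

```python
import numpy as np

def bland_altman(a, b):
    """Bias, SD of differences and 95% limits of agreement between two examiners."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b                                  # examiner A minus examiner B (ms)
    bias = diff.mean()                            # mean difference
    sd = diff.std(ddof=1)                         # standard deviation of differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
    return bias, sd, loa

# Illustrative wave V latencies (ms) annotated by two examiners on the same ABRs
examiner_1 = [5.62, 5.70, 5.58, 5.81, 5.66, 5.74]
examiner_2 = [5.60, 5.74, 5.55, 5.90, 5.64, 5.72]
bias, sd, loa = bland_altman(examiner_1, examiner_2)
print(f"bias = {bias:.3f} ms, SD = {sd:.3f} ms, LoA = [{loa[0]:.3f}, {loa[1]:.3f}] ms")
```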

Relevance: 100.00%

Abstract:

The aim of this article is to improve the communication of the probabilistic flood forecasts generated by hydrological ensemble prediction systems (HEPS) by understanding perceptions of different methods of visualizing probabilistic forecast information. This study focuses on interexpert communication and accounts for differences in visualization requirements based on the information content necessary for individual users. The perceptions of the expert group addressed in this study are important because they are the designers and primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be successfully communicated in ways that would be readily understood by nonexperts. In this article, we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts in a wider range of applications. The results of a case study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts and on what they consider the essential information that should accompany plots and diagrams. In this article, we propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable. Copyright © 2012 John Wiley & Sons, Ltd.

Relevance: 100.00%

Abstract:

We systematically compare the performance of ETKF-4DVAR, 4DVAR-BEN and 4DENVAR with respect to two traditional methods (4DVAR and ETKF) and an ensemble transform Kalman smoother (ETKS) on the Lorenz 1963 model. We specifically investigated this performance with increasing nonlinearity and used a quasi-static variational assimilation algorithm as a comparison. Using the analysis root mean square error (RMSE) as a metric, these methods have been compared considering (1) assimilation window length and observation interval size and (2) ensemble size, to investigate the influence of hybrid background error covariance matrices and nonlinearity on the performance of the methods. For short assimilation windows with close to linear dynamics, it has been shown that all hybrid methods show an improvement in RMSE compared to the traditional methods. For long assimilation window lengths in which nonlinear dynamics are substantial, the variational framework can have difficulties finding the global minimum of the cost function, so we explore a quasi-static variational assimilation (QSVA) framework. Of the hybrid methods, it is seen that under certain parameters, hybrid methods which do not use a climatological background error covariance do not need QSVA to perform accurately. Generally, results show that the ETKS and the hybrid methods that do not use a climatological background error covariance matrix with QSVA outperform all other methods, owing to the full flow dependency of the background error covariance matrix, which also allows for the most nonlinearity.
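
For reference, a minimal sketch of the Lorenz 1963 model and the analysis RMSE metric used to compare the methods. The integration scheme, step size and parameter values below are the standard textbook choices; none of the assimilation schemes themselves are reproduced here.

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Time derivative of the Lorenz 1963 system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta step of the Lorenz 1963 model."""
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def rmse(analysis, truth):
    """Analysis root mean square error over the state vector."""
    return np.sqrt(np.mean((np.asarray(analysis) - np.asarray(truth)) ** 2))

truth = np.array([1.0, 1.0, 1.0])
for _ in range(1000):                              # spin up the "true" trajectory
    truth = rk4_step(truth, 0.01)
analysis = truth + 0.1 * np.random.randn(3)        # stand-in for an analysis state
print(rmse(analysis, truth))
```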

Relevance: 100.00%

Abstract:

Intracellular reactive oxygen species (ROS) production is essential to normal cell function. However, excessive ROS production causes oxidative damage and cell death. Many pharmacological compounds exert their effects on cell cycle progression by changing intracellular redox state and in many cases cause oxidative damage leading to drug cytotoxicity. Appropriate measurement of intracellular ROS levels during cell cycle progression is therefore crucial in understanding redox-regulation of cell function and drug toxicity and for the development of new drugs. However, due to the extremely short half-life of ROS, measuring the changes in intracellular ROS levels during a particular phase of cell cycle for drug intervention can be challenging. In this article, we have provided updated information on the rationale, the applications, the advantages and limitations of common methods for screening drug effects on intracellular ROS production linked to cell cycle study. Our aim is to facilitate biomedical scientists and researchers in the pharmaceutical industry in choosing or developing specific experimental regimens to suit their research needs.

Relevance: 100.00%

Abstract:

In spite of numerous, substantial advances in equine reproduction, many stages of embryonic and fetal morphological development are poorly understood, with no apparent single source of comprehensive information. Hence, the objective of the present study was to provide a complete macroscopic and microscopic description of the equine embryo/fetus at various gestational ages. Thirty-four embryos/fetuses were aged based on their crown-rump length (CRL) and submitted to macroscopic description, biometry, light and scanning microscopy, as well as the alizarin technique. All observed developmental changes were chronologically ordered and described. As examples of the main observed features, an accentuated cervical curvature was observed upon macroscopic examination in all specimens. In the nervous system, the encephalic fourth ventricle and the encephalic vesicles (forebrain, midbrain, and hindbrain) were visualized from Day 19 (ovulation = Day 0). The thoracic and pelvic limbs were also visualized; their extremities gave rise to the hoof during development from Day 27. Development of other structures such as the pigmented optical vesicle, liver, tail, cardiac area, lungs, and dermal vascularization started on Days 25, 25, 19, 19, 34, and 35, respectively. Light and scanning microscopy facilitated detailed examinations of several organs, e.g., heart, kidneys, lungs, and intestine, whereas the alizarin technique enabled visualization of ossification. Observations in this study contributed to the knowledge regarding equine embryogenesis, and included detailed data from many specimens collected over a long developmental interval. © 2011 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

The quality control optimization of medical processes that use ionizing radiation in the treatment of diseases like cancer is a key element for patient safety and the success of treatment. The major medical application of radiation is radiotherapy, i.e. the delivery of dose levels to well-defined target tissues of a patient with the purpose of eliminating a disease. The need for accurate tumour-edge definition, with the purpose of preserving the healthy surrounding tissue, demands rigorous radiation treatment planning. Dosimetric methods are used for dose distribution mapping in the region of interest to ensure that the prescribed dose and the irradiated region are correct. The Fricke gel (FXG) is the main dosimeter that provides visualization of the three-dimensional (3D) dose distribution. In this work the dosimetric characteristics of the modified Fricke dosimeter produced at the Radiation Metrology Centre of the Institute of Energetic and Nuclear Research (IPEN), such as the gel-concentration dependence of the dose response, the influence of xylenol orange addition, the dose response between 5 and 50 Gy, and signal stability, were evaluated by magnetic resonance imaging (MRI). Using the same gel solution, breast simulators (phantoms) were shaped and absorbed dose distributions were imaged by MRI at the Nuclear Resonance Laboratory of the Physics Institute of Sao Paulo University. © 2007 Elsevier Ltd. All rights reserved.
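
A minimal sketch of the kind of dose-response calibration implied by the 5-50 Gy evaluation: a linear least-squares fit of an MRI-derived signal (here a hypothetical relaxation rate) against delivered dose, which can then be inverted to map a measured signal back to absorbed dose. The numbers are illustrative, not data from this work.

```python
import numpy as np

# Illustrative calibration points: delivered dose (Gy) vs MRI signal (e.g. 1/T1, s^-1)
dose = np.array([5.0, 10.0, 20.0, 30.0, 40.0, 50.0])
signal = np.array([1.9, 2.4, 3.5, 4.4, 5.6, 6.5])

# Least-squares linear fit: signal = slope * dose + intercept
slope, intercept = np.polyfit(dose, signal, 1)

def dose_from_signal(s):
    """Invert the calibration to map a measured signal back to absorbed dose."""
    return (s - intercept) / slope

print(f"sensitivity: {slope:.3f} s^-1/Gy, intercept: {intercept:.3f} s^-1")
print(f"estimated dose for signal 4.0 s^-1: {dose_from_signal(4.0):.1f} Gy")
```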

Relevance: 100.00%

Abstract:

Photogem® is a hematoporphyrin derivative that has been used as a photosensitizer in experimental and clinical Photodynamic Therapy (PDT) in Brazil. Photosensitizers are degraded under illumination. This process, usually called photobleaching, can be monitored by a decrease in fluorescence intensities and includes the following photoprocesses: photodegradation, phototransformation, and photorelocalization. Photobleaching of hematoporphyrin-type sensitizers during illumination in aqueous solution is related not only to photodegradation but is also accompanied by the formation of photoproducts with a new fluorescence band at around 640-650 nm and with increased light absorption in the red spectral region at 640 nm. In this study, the influence of pH on the phototransformation process was investigated. Photogem® solutions, 40 µg/mL, were irradiated at 514 nm with an intensity of 100 mW/cm² for 20 min in different pH environments. The controls were performed with the samples in the absence of light. Photogem® photodegradation is dependent on the pH. The behavior of photodegradation and photoproduct formation (monitored at 640 nm) is distinct and depends on the photosensitizer concentration. The processes of degradation and photoproduct formation were monitored with Photogem® at a concentration of 40 µg/mL, since this demonstrated the best visualization of both processes. While photodegradation occurred below pH 5, there was no detectable presence of photoproducts. Increasing the pH led to an increase in the photoproduct formation rate, with photodegradation reaching its highest value at pH 10. The increase of photoproduct formation and the instability of Photogem® from pH 6 to pH 10 are in agreement with the desired properties of an ideal photosensitizer, since there are significant differences in pH between normal (7.0 < pH < 8.6) and tumor (5.8 < pH < 7.9) tissues. It is important to know the effect of pH on the phototransformation process (degradation and photoproduct formation) of the molecule, since low pH values promote an increase in the proportion of aggregated species in solution and high pH values promote an increase in the proportion of monomeric species. There must be an ideal pH interval which favors the phototransformation process that is correlated with the singlet oxygen formation responsible for the photodynamic effect. These differences in pH between normal and tumor cells can explain the presence of photosensitizers in target tumor cells, making PDT a selective therapy.
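
Since photobleaching is monitored here through decreasing fluorescence intensities, one simple way to quantify it is to fit a decay curve to fluorescence versus delivered light dose. The sketch below assumes first-order (exponential) kinetics, which the abstract does not state explicitly, and uses illustrative numbers roughly consistent with the stated irradiation conditions (100 mW/cm² for 20 min is about 120 J/cm²).

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(light_dose, f0, k):
    """Fluorescence intensity vs delivered light dose under first-order bleaching."""
    return f0 * np.exp(-k * light_dose)

# Illustrative data: light dose (J/cm^2) and relative fluorescence at the main band
light_dose = np.array([0, 12, 24, 36, 48, 60, 90, 120])
fluorescence = np.array([1.00, 0.81, 0.66, 0.55, 0.44, 0.37, 0.24, 0.16])

(f0, k), _ = curve_fit(first_order_decay, light_dose, fluorescence, p0=(1.0, 0.01))
print(f"fitted bleaching rate constant k = {k:.4f} cm^2/J")
```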

Relevance: 100.00%

Abstract:

Quadratic assignment problems (QAPs) are commonly solved by heuristic methods, where the optimum is sought iteratively. Heuristics are known to provide good solutions, but the quality of the solutions, i.e., the confidence interval of the solution, is unknown. This paper uses statistical optimum estimation techniques (SOETs) to assess the quality of genetic algorithm solutions for QAPs. We examine the functioning of different SOETs regarding bias, coverage rate and interval length, and then we compare the SOET lower bound with deterministic ones. The commonly used deterministic bounds are confined to only a few algorithms. We show that the Jackknife estimators have better performance than Weibull estimators, and when the number of heuristic solutions is as large as 100, higher-order JK estimators perform better than lower-order ones. Compared with the deterministic bounds, the SOET lower bound performs significantly better than most deterministic lower bounds and is comparable with the best deterministic ones.
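
As an illustration of the SOET idea, the sketch below computes a first-order (delete-one) jackknife point estimate of the optimum from a sample of heuristic objective values for a minimization problem; for large samples it reduces to the familiar 2·y(1) − y(2) form. The higher-order jackknife and Weibull estimators examined in the paper are not reproduced here, and the solution values are synthetic.

```python
import numpy as np

def jackknife_min_estimate(values):
    """
    First-order (delete-one) jackknife bias correction of the sample minimum,
    used as a point estimate of the unknown optimum of a minimization problem.
    """
    y = np.sort(np.asarray(values, dtype=float))
    n = len(y)
    # Delete-one minima: y[1] when the best value itself is removed, else y[0]
    loo_mean = ((n - 1) * y[0] + y[1]) / n
    return n * y[0] - (n - 1) * loo_mean    # -> approx. 2*y[0] - y[1] for large n

# Illustrative objective values from 100 independent heuristic (e.g. GA) runs
rng = np.random.default_rng(0)
solutions = 9000 + rng.gamma(shape=2.0, scale=40.0, size=100)
print(f"best found : {solutions.min():.1f}")
print(f"JK estimate: {jackknife_min_estimate(solutions):.1f}")
```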

Relevance: 100.00%

Abstract:

Objective: To design, develop and set up a web-based system for enabling graphical visualization of the upper limb motor performance (ULMP) of Parkinson's disease (PD) patients to clinicians.

Background: Sixty-five patients diagnosed with advanced PD used a test battery, implemented in a touch-screen handheld computer, in their home environment settings over the course of a 3-year clinical study. The test items consisted of objective measures of ULMP through a set of upper limb motor tests (finger tapping and spiral drawing). For the tapping tests, patients were asked to perform alternate tapping of two buttons as fast and accurately as possible, first using the right hand and then the left hand. The test duration was 20 seconds. For the spiral drawing test, patients traced a pre-drawn Archimedes spiral using the dominant hand, and the test was repeated 3 times per test occasion. In total, the study database consisted of symptom assessments during 10079 test occasions.

Methods: Visualization of ULMP. The web-based system is used by two neurologists for assessing the performance of PD patients during the motor tests collected over the course of the said study. The system employs animations, scatter plots and time series graphs to visualize the ULMP of patients to the neurologists. The performance during spiral tests is depicted by animating the three spiral drawings, allowing the neurologists to observe real-time accelerations or hesitations and sharp changes during the actual drawing process. The tapping performance is visualized by displaying different types of graphs. The information presented includes the distribution of taps over the two buttons, horizontal tap distance vs. time, vertical tap distance vs. time, and tapping reaction time over the test length. Assessments. Different scales are utilized by the neurologists to assess the observed impairments. For the spiral drawing performance, the neurologists rated firstly the 'impairment' using a 0 (no impairment) to 10 (extremely severe) scale, secondly three kinematic properties, 'drawing speed', 'irregularity' and 'hesitation', using a 0 (normal) to 4 (extremely severe) scale, and thirdly the probable 'cause' of the said impairment using 3 choices: Tremor, Bradykinesia/Rigidity and Dyskinesia. For the tapping performance, a 0 (normal) to 4 (extremely severe) scale is used for first rating four tapping properties, 'tapping speed', 'accuracy', 'fatigue' and 'arrhythmia', and then the 'global tapping severity' (GTS). To achieve a common basis for assessment, initially one neurologist (DN) performed preliminary ratings by browsing through the database to collect and rate at least 20 samples of each GTS level and at least 33 samples of each 'cause' category. These preliminary ratings were then reviewed by the two neurologists (DN and PG) and used as templates for the subsequent ratings. In another track, the system randomly selected one test occasion per patient and visualized its items, that is, tapping and spiral drawings, to the two neurologists. Statistical methods. Inter-rater agreement was assessed using the weighted Kappa coefficient. The internal consistency of the properties of the tapping and spiral drawing tests was assessed using Cronbach's α test. A one-way ANOVA followed by Tukey's multiple comparisons test was used to test whether the mean scores of the properties of the tapping and spiral drawing tests differed among GTS and 'cause' categories, respectively.

Results: When rating tapping graphs, inter-rater agreements (Kappa) were as follows: GTS (0.61), 'tapping speed' (0.89), 'accuracy' (0.66), 'fatigue' (0.57) and 'arrhythmia' (0.33). The poor inter-rater agreement when assessing 'arrhythmia' may be a result of the two raters observing different things in the graphs. When rating animated spirals, both raters had very good agreement when assessing the severity of spiral drawings, that is, 'impairment' (0.85) and 'irregularity' (0.72). However, there was poor agreement between the two raters when assessing 'cause' (0.38) and time-information properties such as 'drawing speed' (0.25) and 'hesitation' (0.21). The tapping properties, that is, 'tapping speed', 'accuracy', 'fatigue' and 'arrhythmia', had satisfactory internal consistency with a Cronbach's α coefficient of 0.77. In general, the mean scores of the tapping properties worsened with increasing levels of GTS. The mean scores of the four properties were significantly different from each other, only at different levels. In contrast to the tapping properties, the kinematic properties of spirals, that is, 'drawing speed', 'irregularity' and 'hesitation', had questionable internal consistency, with a coefficient of 0.66. Bradykinetic spirals were associated with more impaired speed (mean = 83.7% worse, P < 0.001) and hesitation (mean = 77.8% worse, P < 0.001) compared to dyskinetic spirals. Both these 'cause' categories had similar mean scores for 'impairment' and 'irregularity'.

Conclusions: In contrast to current approaches used in clinical settings for the assessment of PD symptoms, this system enables clinicians to easily and realistically animate the ULMP of patients who are, at the same time, in their homes. Dynamic access to visualized motor tests may also be useful when observing and evaluating therapy-related complications such as under- and over-medication. In the future, we foresee utilizing these manual ratings for developing and validating computer methods that automate the process of assessing the ULMP of PD patients.
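
A minimal sketch of the agreement statistic named above: a weighted Cohen's kappa between two raters' ordinal scores, computed with scikit-learn. The score vectors are illustrative, and whether linear or quadratic weights were used is not stated in the abstract, so both are shown.

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative global tapping severity (0-4) scores from two neurologists
rater_1 = [0, 1, 1, 2, 3, 4, 2, 1, 0, 3, 2, 4]
rater_2 = [0, 1, 2, 2, 3, 3, 2, 1, 1, 3, 2, 4]

kappa_linear = cohen_kappa_score(rater_1, rater_2, weights="linear")
kappa_quadratic = cohen_kappa_score(rater_1, rater_2, weights="quadratic")
print(f"weighted kappa (linear)   : {kappa_linear:.2f}")
print(f"weighted kappa (quadratic): {kappa_quadratic:.2f}")
```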

Relevance: 100.00%

Abstract:

Grammar has always been an important part of language learning. Based on various theories, such as universal grammar theory (Chomsky, 1959) and the input theory (Krashen, 1970), explicit and implicit teaching methods have been developed. Research shows that both methods may have some benefits and disadvantages. The attitude towards English grammar teaching methods in schools has also changed, and nowadays grammar teaching methods and learning strategies, as a part of language mastery, are among the discussion topics of linguists. This study focuses on teacher and learner experiences and beliefs about teaching English grammar and the difficulties learners may face. The aim of the study is to conduct a literature review and to find out what scientific knowledge exists concerning the previously named topics. Along with this, the relevant steering documents are investigated, focusing on grammar teaching at Swedish upper secondary schools. Chomsky's universal grammar theory as well as Krashen's input hypotheses provide the theoretical background for the current study. The study has been conducted applying qualitative and quantitative methods. A systematic search in four databases, LIBRIS, ERIC, LLBA and Google Scholar, was used for collecting relevant publications. The results show that scientific publications name different grammar areas that are perceived as problematic for learners all over the world. The most common explanation of these difficulties is the influence of the learner's L1. Research presents teachers' and learners' beliefs about the benefits of grammar teaching methods. An effective combination of teaching methods needs to be found to fit learners' expectations and individual needs. Together, they will contribute to the achievement of higher language proficiency levels and, therefore, can be successfully applied at Swedish upper secondary schools.

Relevance: 100.00%

Abstract:

This paper estimates the impact of the use of structured methods on the quality of education of students in primary public schools in Brazil. Structured methods encompass a range of pedagogical and managerial instruments applied to the education system. In recent years, several municipalities in the State of São Paulo have contracted out private educational providers to implement these structured methods in their schooling systems. Their pedagogical proposal involves structuring curriculum contents, the elaboration and use of teachers' and students' textbooks, and the training and supervision of teachers and instructors. Using a difference-in-differences estimation strategy, we find that fourth- and eighth-grade students in the municipalities with structured methods performed better in Portuguese and Math than students in municipalities not exposed to the methods. We find no differences in approval rates. However, a robustness check cannot rule out the possibility that unobserved municipal characteristics may affect the results.
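
A minimal sketch of a difference-in-differences estimate of the kind described above, expressed as an OLS regression with a treatment-by-period interaction in statsmodels; the data frame, column names and values are hypothetical, not the Brazilian school data used in the paper.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per municipality and period, with mean test scores
df = pd.DataFrame({
    "score":   [180, 185, 190, 205, 178, 183, 188, 192],
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],     # municipality adopted structured methods
    "post":    [0, 0, 1, 1, 0, 0, 1, 1],     # observation after the adoption period
})

# Difference-in-differences: the coefficient on treated:post is the effect estimate
model = smf.ols("score ~ treated + post + treated:post", data=df).fit()
print(model.params["treated:post"])
```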

Relevance: 100.00%

Abstract:

The purpose of this project is to understand, under a social constructionist approach, the meanings that external facilitators and organizational members (sponsors) working with dialogic methods place on themselves and their work. Dialogic methods, with the objective of engaging groups in flows of conversation to envisage and co-create their own future, are growing fast within organizations as a means to achieve collective change. Sharing constructionist ideas about the possibility of multiple realities and language as constitutive of such realities, dialogue has turned into a promising way for transformation, especially in a macro context of constant change and increasing complexity, where traditional structures, relationships and forms of work are questioned. Research on the topic has mostly focused on specific methods or applications, with few attempts to study it in a broader sense. Also, despite the fact that dialogic methods work on the assumption that realities are socially constructed, few studies approach the topic from a social constructionist perspective, as a research methodology per se. Thus, while most existing research aims at explaining whether or how particular methods meet particular results, my intention is to explore the meanings sustaining these new forms of organizational practice. Data were collected through semi-structured interviews with 25 people working with dialogic methods: 11 facilitators and 14 sponsors, from 8 different organizations in Brazil. Firstly, the research findings indicate several contextual elements that seem to sustain the choice of dialogic methods. Within this context, there does not seem to be a clear or specific demand for dialogic methods, but rather a set of different motivations, objectives and focuses, bringing about several contrasts in the way participants name, describe and explain their experiences with such methods, including tensions around power relations, knowledge creation, identity and communication. Secondly, some central ideas or images were identified within such contrasts, pointing in two directions: dialogic methods as opportunities for the creation of new organizational realities (with images of a 'door' or a 'flow', for instance, which suggest that dialogic methods may open up access to other perspectives and the creation of new realities); and dialogic methods as new instrumental mechanisms that seem to reproduce traditional and non-dialogical forms of work and relationship. The individualistic tradition and its tendency towards rational schematism - pointed out by social constructionist scholars as strong traditions in our Western culture - could be observed in some participants' accounts through the image of dialogic methods as a 'gym', for instance, in which dialogical (and idealized) 'abilities' could be taught and trained, turning dialogue into a tool rather than a means for transformation. As a conclusion, I discuss what the implications of such taken-for-granted assumptions may be, and offer some insights into dialogue (and dialogic methods) as 'the art of being together'.