949 results for Visualization Of Interval Methods


Relevance: 100.00%

Abstract:

Oxidation of proteins has received much attention in recent decades because oxidized proteins have been shown to accumulate and to be implicated in the progression and pathophysiology of several diseases, such as Alzheimer's disease and coronary heart disease. As a result, research scientists have become more eager to measure accurately the level of oxidized protein in biological materials and to determine the precise site of the oxidative attack on the protein, in order to gain insight into the molecular mechanisms involved in disease progression. Several methods for measuring protein carbonylation have been implemented in laboratories around the world; however, to date no single method prevails as the most accurate, reliable, and robust. The present paper aims to give an overview of the common methods used to determine protein carbonylation in biological material and to highlight their limitations and potential. The ultimate goal is to offer quick tips for rapid decision making when a method must be selected, taking into consideration the advantages and drawbacks of each method.

Relevance: 100.00%

Abstract:

* The work is partially supported by Grant no. NIP917 of the Ministry of Science and Education – Republic of Bulgaria.

Relevance: 100.00%

Abstract:

In this work we give sufficient conditions for the k-th approximations of the polynomial roots of f(x) when the Maehly–Aberth–Ehrlich, Werner–Börsch-Supan, Tanabe, and Improved Börsch-Supan iteration methods fail on the next step. For these methods all non-attractive sets are found. This improves on previously developed techniques and known results. Users of these methods can apply the results presented here in software implementations for distributed applications and simulation environments. Numerical examples with graphics are shown.
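The flavor of simultaneous iteration analyzed above can be sketched in a few lines. The following is a minimal Python implementation of the classic Aberth–Ehrlich iteration (not the paper's improved variants); the cubic and the starting points are illustrative assumptions, and the comments mark the vanishing denominators where such methods can fail on the next step.

```python
def horner(coeffs, x):
    """Evaluate p(x) and p'(x); coeffs are in descending degree order."""
    p, dp = 0j, 0j
    for c in coeffs:
        dp = dp * x + p   # derivative accumulates the previous p
        p = p * x + c
    return p, dp

def aberth_ehrlich(coeffs, z, steps=50, tol=1e-12):
    """Refine all root approximations z simultaneously."""
    z = list(z)
    n = len(z)
    for _ in range(steps):
        converged = True
        for i in range(n):
            p, dp = horner(coeffs, z[i])
            if dp == 0:                # Newton correction undefined
                continue
            newton = p / dp
            s = sum(1 / (z[i] - z[j]) for j in range(n) if j != i)
            denom = 1 - newton * s
            if denom == 0:             # breakdown: the step is undefined
                continue
            step = newton / denom
            z[i] -= step
            if abs(step) > tol:
                converged = False
        if converged:
            break
    return z

# Illustrative example: p(x) = x^3 - 6x^2 + 11x - 6 has roots 1, 2, 3
roots = aberth_ehrlich([1, -6, 11, -6], [0.4 + 0.9j, 2.5 - 0.7j, 3.3 + 0.2j])
```

The `denom == 0` branch is exactly the kind of breakdown for which such papers derive sufficient conditions and non-attractive sets.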

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 65H10.

Relevance: 100.00%

Abstract:

In finance, risk capital allocation raises important questions from both theoretical and practical points of view. How should the risk of a portfolio be shared among its subportfolios? How should capital be reserved to hedge existing risk, and how should it be assigned to different business units? We use an axiomatic approach to examine risk capital allocation; that is, we require fundamental properties of the methods. Our starting point is Csóka and Pintér (2011), who show, by generalizing Young (1985)'s axiomatization of the Shapley value, that the requirements of Core Compatibility, Equal Treatment Property, and Strong Monotonicity are irreconcilable when risk is quantified by a coherent measure of risk. In this paper we examine these requirements using analytic and simulation tools. We consider allocation methods used in practice as well as ones that are theoretically interesting. Our main result is that the problem raised by Csóka and Pintér (2011) is relevant in practical applications, not only in theory. We also believe that, through the characterizations of the examined methods, our paper can serve as a useful guide for practitioners.
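The Shapley value mentioned above can be made concrete with a small sketch. Below is a minimal Python illustration of Shapley-style risk capital allocation, averaging marginal contributions over all orderings of business units; the three units, their scenario losses, and the use of scenario standard deviation as the risk measure are illustrative assumptions (standard deviation is not itself a coherent risk measure).

```python
from itertools import permutations
from statistics import pstdev

def risk(losses_by_unit, coalition):
    """Risk of a coalition: std. dev. of its aggregated scenario losses."""
    if not coalition:
        return 0.0
    n_scenarios = len(next(iter(losses_by_unit.values())))
    total = [sum(losses_by_unit[u][s] for u in coalition)
             for s in range(n_scenarios)]
    return pstdev(total)

def shapley_allocation(losses_by_unit):
    """Average marginal risk contribution over all orderings of units."""
    units = list(losses_by_unit)
    alloc = {u: 0.0 for u in units}
    orders = list(permutations(units))
    for order in orders:
        coalition = []
        for u in order:
            before = risk(losses_by_unit, coalition)
            coalition.append(u)
            alloc[u] += risk(losses_by_unit, coalition) - before
    return {u: alloc[u] / len(orders) for u in units}

# Hypothetical scenario losses for three business units
losses = {"A": [1.0, -2.0, 3.0], "B": [0.5, 1.5, -1.0], "C": [2.0, 0.0, 1.0]}
alloc = shapley_allocation(losses)
```

By construction the allocations sum to the risk of the full portfolio (Efficiency); the axioms discussed in the paper (Core Compatibility, Equal Treatment Property, Strong Monotonicity) are further constraints on such methods.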

Relevance: 100.00%

Abstract:

Despite the frequency with which fevers occur in children ages 1–3 years, lack of knowledge and understanding about the implications of fever and methods of fever management often results in anxiety among caretakers, sometimes prompting them to seek help at nearby emergency departments. Caretakers often look to health care professionals for advice and guidance over the telephone. The purpose of this study was to investigate caretakers' knowledge of the implications of fever, methods of fever management, perceptions of pediatric telephone triage and advice services regarding fever, and the effectiveness of after-hours telephone triage in improving caretakers' ability to manage their child's fever at home. Pre-triage questionnaires were completed by 72 caretakers over the telephone before the triage encounter. Twenty-two of those caretakers, whose children were triaged using the fever guideline, completed and returned the mailed post-triage questionnaire. Descriptive statistics were used to analyze responses for the larger pre-intervention group and to compare pre- and post-triage responses in the smaller sample subset (n = 22).

Relevance: 100.00%

Abstract:

This dissertation documents the everyday lives and spaces of a population of youth typically constructed as out of place, and the broader urban context in which they are rendered as such. Thirty-three female and transgender street youth participated in the development of this youth-based participatory action research (YPAR) project, which utilized geo-ethnographic methods, auto-photography, and archival research throughout a six-phase, eighteen-month research process in Bogotá, Colombia. This dissertation details the participatory writing process that enabled the YPAR research team to destabilize dominant representations of both street girls and urban space, and the participatory mapping process that enabled the development of a youth vision of the city through cartographic images. The maps display individual and aggregate spatial data indicating trends within, and comparisons between, three subgroups of the research population according to nine spatial variables. These spatial data, coupled with photographic and ethnographic data, substantiate that street girls' mobilities and activity spaces intersect with and are altered by state-sponsored urban renewal projects and paramilitary-led social cleansing killings, both efforts to clean up Bogotá by purging the city center of deviant populations and places. Advancing an ethical approach to conducting research with excluded populations, this dissertation argues for the enactment of critical field praxis and care ethics within a YPAR framework in order to incorporate young people as principal research actors rather than merely voices represented in adultist academic discourse. The interjection of considerations of space, gender, and participation into the study of street youth produces new ways of envisioning the city and the role of young people in research.
Instead of seeing the city from a panoptic view, Bogotá is revealed through the eyes of street youth who participated in the construction and feminist visualization of a new cartography and counter-map of the city grounded in embodied, situated praxis. This dissertation presents a socially responsible approach to conducting action research with high-risk youth by documenting how street girls reclaim their right to the city on paper and in practice: through maps of their everyday exclusion in Bogotá, followed by activism to fight against it.

Relevance: 100.00%

Abstract:

Gene-based tests of association are frequently applied to common SNPs (MAF>5%) as an alternative to single-marker tests. In this analysis we conduct a variety of simulation studies applied to five popular gene-based tests investigating general trends related to their performance in realistic situations. In particular, we focus on the impact of non-causal SNPs and a variety of LD structures on the behavior of these tests. Ultimately, we find that non-causal SNPs can significantly impact the power of all gene-based tests. On average, we find that the “noise” from 6–12 non-causal SNPs will cancel out the “signal” of one causal SNP across five popular gene-based tests. Furthermore, we find complex and differing behavior of the methods in the presence of LD within and between non-causal and causal SNPs. Ultimately, better approaches for a priori prioritization of potentially causal SNPs (e.g., predicting functionality of non-synonymous SNPs), application of these methods to sequenced or fully imputed datasets, and limited use of window-based methods for assigning inter-genic SNPs to genes will improve power. However, significant power loss from non-causal SNPs may remain unless alternative statistical approaches robust to the inclusion of non-causal SNPs are developed.
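The dilution effect described above ("noise" from non-causal SNPs cancelling the "signal" of a causal SNP) can be reproduced with a toy simulation. The sketch below uses a plain burden score and a correlation test as stand-ins chosen for illustration, not the five gene-based tests studied in the paper; sample size, MAF, and effect size are likewise made-up assumptions.

```python
import random

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def burden_test_power(n_noncausal, reps=100, n=200, maf=0.3, beta=0.6, seed=7):
    """Fraction of simulations where the gene-level burden score is significant."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        m = 1 + n_noncausal  # one causal SNP plus noise SNPs
        geno = [[sum(rng.random() < maf for _ in range(2)) for _ in range(m)]
                for _ in range(n)]
        # phenotype depends only on the first (causal) SNP
        y = [beta * g[0] + rng.gauss(0, 1) for g in geno]
        burden = [sum(g) for g in geno]  # gene-level burden score
        r = pearson(burden, y)
        z = r * (n - 2) ** 0.5 / (1 - r * r) ** 0.5
        hits += abs(z) > 1.96
    return hits / reps
```

Comparing `burden_test_power(0)` with `burden_test_power(12)` shows the power loss as non-causal SNPs are folded into the gene-level statistic.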

Relevance: 100.00%

Abstract:

Microarray platforms have been in use for many years, and although new technologies are emerging in laboratories, microarrays remain prevalent. Many methods for analyzing microarray data to identify differentially expressed (DE) genes have been proposed and refined; however, the most popular methods, such as Significance Analysis of Microarrays (SAM), samroc, fold change, and rank product, are far from perfect. Which method is most powerful depends on the characteristics of the sample and the distribution of the gene expressions. SAM or samroc is usually the method of choice, but when the data are skewed, the power of these methods decreases. Building on the idea that the median is a better measure of central tendency than the mean when data are skewed, this thesis modifies the test statistics of the SAM and fold change methods. The study shows that the median-modified fold change method improves power in many cases when identifying DE genes, provided the data follow a lognormal distribution.
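The median modification described above amounts to swapping the group center used in the fold change statistic. A minimal sketch, with made-up log2 expression values:

```python
from statistics import mean, median

def fold_change(control, treated, center=mean):
    """Difference of group centers on the log2 scale."""
    return center(treated) - center(control)

# A hypothetical skewed gene: one outlier in the treated group
control = [5.0, 5.1, 4.9, 5.0, 5.2]
treated = [5.1, 5.0, 5.2, 5.1, 9.5]   # 9.5 is the outlier

fc_mean = fold_change(control, treated, center=mean)       # ~0.94
fc_median = fold_change(control, treated, center=median)   # ~0.1
```

With one outlier in the treated group, the mean-based fold change (about 0.94) suggests differential expression while the median-based version (about 0.1) does not; this robustness is what improves power under skewed (e.g., lognormal) data.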

Relevance: 100.00%

Abstract:

BACKGROUND: Involuntary job loss is a major life event associated with social, economic, behavioural, and health outcomes, for which older workers are at elevated risk. OBJECTIVE: To assess the 10 year risk of myocardial infarction (MI) and stroke associated with involuntary job loss among workers over 50 years of age. METHODS: Analysing data from the nationally representative US Health and Retirement Survey (HRS), Cox proportional hazards analysis was used to estimate whether workers who suffered involuntary job loss were at higher risk for subsequent MI and stroke than individuals who continued to work. The sample included 4301 individuals who were employed at the 1992 study baseline. RESULTS: Over the 10 year study frame, 582 individuals (13.5% of the sample) experienced involuntary job loss. After controlling for established predictors of the outcomes, displaced workers had a more than twofold increase in the risk of subsequent MI (hazard ratio (HR) = 2.48; 95% confidence interval (CI) = 1.49 to 4.14) and stroke (HR = 2.43; 95% CI = 1.18 to 4.98) relative to working persons. CONCLUSION: Results suggest that the true costs of late career unemployment exceed financial deprivation, and include substantial health consequences. Physicians who treat individuals who lose jobs as they near retirement should consider the loss of employment a potential risk factor for adverse vascular health changes. Policy makers and programme planners should also be aware of the risks of job loss, so that programmatic interventions can be designed and implemented to ease the multiple burdens of joblessness.
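The reported intervals are standard Wald confidence intervals on the log hazard ratio scale. As a quick numerical check, using only the published MI figures (HR = 2.48, 95% CI 1.49 to 4.14), one can recover the implied standard error and reconstruct the interval:

```python
import math

def se_from_ci(lo, hi, z=1.96):
    """Standard error of log(HR) implied by a 95% CI (lo, hi)."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

def wald_ci(hr, se, z=1.96):
    """Reconstruct the 95% CI from HR and the SE of log(HR)."""
    return math.exp(math.log(hr) - z * se), math.exp(math.log(hr) + z * se)

se_mi = se_from_ci(1.49, 4.14)   # from the reported MI interval
lo, hi = wald_ci(2.48, se_mi)    # lands close to the published 1.49 and 4.14
```

That the reconstruction matches the published bounds to within rounding reflects that the reported HR is approximately the geometric mean of the interval endpoints, as expected for a Wald interval.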

Relevance: 100.00%

Abstract:

Humans and animals have remarkable capabilities in keeping time and in using time to guide their learning and decision making. Psychophysical models of timing and time perception have been proposed for decades and have received support from behavioral, anatomical, and pharmacological data. However, despite numerous studies aimed at delineating the neural underpinnings of interval timing, a complete picture of the neurobiological network of timing in the seconds-to-minutes range remains elusive. Using classical interval timing protocols and a proposed Timing, Immersive Memory and Emotional Regulation (TIMER) test battery, the author investigates the contributions of the dorsal and ventral hippocampus, as well as the dorsolateral and dorsomedial striatum, to interval timing by comparing timing performance in mice after cytotoxic lesions of the corresponding brain regions. In addition, a timing-based theoretical framework for the emergence of conscious experience, closely related to the function of the claustrum, is proposed to provide guidance both for biology and for the research and evolution of "strong" artificial intelligence. Finally, a new "Double Saturation Model of Interval Timing" that integrates the direct and indirect pathways of the striatum is proposed to explain the set of empirical findings.

Relevance: 100.00%

Abstract:

BACKGROUND: The American College of Cardiology guidelines recommend 3 months of anticoagulation after replacement of the aortic valve with a bioprosthesis. However, there remains great variability in current clinical practice, and clinical studies have yielded conflicting results. To assist clinical decision making, we pooled the existing evidence to assess whether anticoagulation in the setting of a new bioprosthesis was associated with improved outcomes or a greater risk of bleeding. METHODS AND RESULTS: We searched the PubMed database from its inception until April 2015 to identify original studies (observational studies or clinical trials) that assessed anticoagulation with warfarin in comparison with either aspirin or no antiplatelet or anticoagulant therapy. We included studies if their outcomes included thromboembolism or stroke/transient ischemic attacks and bleeding events. Quality assessment was performed in accordance with the Newcastle-Ottawa Scale, and random-effects analysis was used to pool the data from the available studies. I² testing was done to assess the heterogeneity of the included studies. After screening 170 articles, a total of 13 studies (cases=6431; controls=18210) were included in the final analyses. The use of warfarin was associated with a significantly increased risk of overall bleeding (odds ratio, 1.96; 95% confidence interval, 1.25-3.08; P<0.0001) and of bleeding at 3 months (odds ratio, 1.92; 95% confidence interval, 1.10-3.34; P<0.0001) compared with aspirin or placebo. With regard to the composite primary outcome (risk of venous thromboembolism, stroke, or transient ischemic attack) at 3 months, no significant difference was seen with warfarin (odds ratio, 1.13; 95% confidence interval, 0.82-1.56; P=0.67). Moreover, anticoagulation was not shown to improve outcomes beyond 3 months (odds ratio, 1.12; 95% confidence interval, 0.80-1.58; P=0.79).
CONCLUSIONS: Contrary to the current guidelines, this meta-analysis suggests that anticoagulation in the setting of an aortic bioprosthesis significantly increases bleeding risk without a favorable effect on thromboembolic events. Larger randomized controlled studies should be performed to further guide clinical practice.
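The random-effects pooling used in such analyses can be sketched briefly. Below is a minimal DerSimonian-Laird implementation in Python; it is a common estimator for random-effects meta-analysis, though the paper does not specify its exact estimator, and the three input studies are fabricated for illustration only, not the 13 included studies.

```python
import math

def pool_random_effects(studies, z=1.96):
    """studies: list of (OR, ci_lo, ci_hi). Returns (pooled OR, lo, hi)."""
    logs = [math.log(or_) for or_, _, _ in studies]
    # SE of each log-OR recovered from its 95% confidence interval
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for _, lo, hi in studies]
    w = [1 / s ** 2 for s in ses]                  # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    # Cochran's Q drives the between-study variance estimate tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logs))
    df = len(studies) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    wr = [1 / (s ** 2 + tau2) for s in ses]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, logs)) / sum(wr)
    se = 1 / math.sqrt(sum(wr))
    return (math.exp(pooled),
            math.exp(pooled - z * se), math.exp(pooled + z * se))

# Three hypothetical studies: (OR, 95% CI lower, 95% CI upper)
studies = [(1.8, 1.1, 2.9), (2.3, 1.2, 4.4), (1.5, 0.8, 2.8)]
or_pooled, ci_lo, ci_hi = pool_random_effects(studies)
```

Each study's standard error on the log-odds scale is recovered from its confidence interval, a between-study variance tau² is estimated from Cochran's Q, and the pooled OR uses the tau²-inflated weights.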

Relevance: 100.00%

Abstract:

Minimally-invasive microsurgery has resulted in improved outcomes for patients. However, operating through a microscope limits depth perception and fixes the visual perspective, which result in a steep learning curve to achieve microsurgical proficiency. We introduce a surgical imaging system employing four-dimensional (live volumetric imaging through time) microscope-integrated optical coherence tomography (4D MIOCT) capable of imaging at up to 10 volumes per second to visualize human microsurgery. A custom stereoscopic heads-up display provides real-time interactive volumetric feedback to the surgeon. We report that 4D MIOCT enhanced suturing accuracy and control of instrument positioning in mock surgical trials involving 17 ophthalmic surgeons. Additionally, 4D MIOCT imaging was performed in 48 human eye surgeries and was demonstrated to successfully visualize the pathology of interest in concordance with preoperative diagnosis in 93% of retinal surgeries and the surgical site of interest in 100% of anterior segment surgeries. In vivo 4D MIOCT imaging revealed sub-surface pathologic structures and instrument-induced lesions that were invisible through the operating microscope during standard surgical maneuvers. In select cases, 4D MIOCT guidance was necessary to resolve such lesions and prevent post-operative complications. Our novel surgical visualization platform achieves surgeon-interactive 4D visualization of live surgery which could expand the surgeon's capabilities.

Relevance: 100.00%

Abstract:

Visualization and interpretation of geological observations into a cohesive geological model are essential to the Earth sciences and related fields. Various emerging technologies offer approaches to multi-scale visualization of heterogeneous data, providing new opportunities that facilitate model development and interpretation. These include increased accessibility to 3D scanning technology, global connectivity, and Web-based interactive platforms. The geological sciences and geological engineering disciplines are adopting these technologies as volumes of data and physical samples greatly increase; however, a standardized and universally agreed-upon workflow has yet to be developed. In this thesis, the 3D scanning workflow is presented as a foundation for a virtual geological database. This database provides an added level of tangibility to students and researchers who have little or no access to remote or inaccessible locations. A Web-GIS platform was used, together with custom widgets developed over the course of this research, to visualize hand-sized/meso-scale geological samples within a geologic and geospatial context. This context is provided as a macro-scale GIS interface in which geophysical and geodetic images and data are visualized. Specifically, an interactive interface is developed that allows simultaneous visualization to improve the understanding of geological trends and relationships. These tools will allow rapid data access and global sharing, and will facilitate comprehension of geological models using multi-scale heterogeneous observations.

Relevance: 100.00%

Abstract:

Major food adulteration and contamination events occur with alarming regularity and are known to be episodic; the question is not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can make them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This can make the task of deciding which analytical methods are most suitable for collecting and analysing (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally handheld and/or remote sensor devices, that can be taken to, or positioned on-line or at-line at, points of vulnerability along complex food supply networks, and should require a minimum of background training to acquire information-rich data rapidly (i.e., point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a perspective on how this growing portfolio of detection methods, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems, and therefore requires systems-level solutions and thinking.