Abstract:
Scoping behavioral variations to dynamic extents is useful to support non-functional requirements that otherwise result in cross-cutting code. Unfortunately, such variations are difficult to achieve with traditional reflection or aspects. We show that with a modification of dynamic proxies, called delegation proxies, it becomes possible to reflectively implement variations that propagate to all objects accessed in the dynamic extent of a message send. We demonstrate our approach with examples of variations scoped to dynamic extents that help simplify code related to safety, reliability, and monitoring.
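The paper's delegation proxies are implemented for Smalltalk; as a rough intuition for how a variation can follow the dynamic extent of a call, the Python sketch below wraps a target object and re-wraps every non-primitive value it returns, so the variation (here, tracing) propagates along chains of message sends. Unlike true delegation proxies, this wrapper cannot intercept the target's own self-sends, so it only approximates the idea; all class and method names are invented for the illustration.

```python
class PropagatingProxy:
    """Illustrative wrapper: forwards attribute access to a target object,
    applies a variation to every intercepted call, and wraps returned objects
    in the same proxy so the variation follows the dynamic extent."""
    _PRIMITIVES = (int, float, str, bool, bytes, type(None))

    def __init__(self, target, variation):
        object.__setattr__(self, "_target", target)
        object.__setattr__(self, "_variation", variation)

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if not callable(attr):
            return self._wrap(attr)
        def intercepted(*args, **kwargs):
            self._variation(f"call {name}")          # the scoped variation: tracing
            return self._wrap(attr(*args, **kwargs))  # propagate to returned objects
        return intercepted

    def _wrap(self, value):
        if isinstance(value, self._PRIMITIVES):
            return value
        return PropagatingProxy(value, self._variation)


class Account:                                       # hypothetical target class
    def __init__(self):
        self.history = []
    def deposit(self, amount):
        self.history.append(amount)
        return self                                  # returning self enables chaining

trace = []
account = PropagatingProxy(Account(), trace.append)
account.deposit(10).deposit(20)                      # the proxy returned by deposit() is traced too
print(trace)                                         # ['call deposit', 'call deposit']
```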
Abstract:
The advent of single molecule fluorescence microscopy has allowed experimental molecular biophysics and biochemistry to transcend traditional ensemble measurements, where the behavior of individual proteins could not be precisely sampled. The recent explosion in popularity of new super-resolution and super-localization techniques, coupled with technical advances in optical designs and fast, highly sensitive cameras with single photon sensitivity and millisecond time resolution, has made it possible to track key motions, reactions, and interactions of individual proteins with high temporal resolution and spatial resolution well beyond the diffraction limit. Within the purview of membrane proteins and ligand-gated ion channels (LGICs), these outstanding advances in single molecule microscopy allow for the direct observation of discrete biochemical states and their fluctuation dynamics. Such observations are fundamentally important for understanding molecular-level mechanisms governing these systems. Examples reviewed here include the effects of allostery on the stoichiometry of ligand binding in the presence of fluorescent ligands; the observation of subdomain partitioning of membrane proteins due to microenvironment effects; and the use of single particle tracking experiments to elucidate characteristics of membrane protein diffusion and the direct measurement of thermodynamic properties, which govern the free energy landscape of protein dimerization. The review of such characteristic topics represents a snapshot of efforts to push the boundaries of fluorescence microscopy of membrane proteins to the absolute limit.
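The single particle tracking analysis mentioned above typically reduces to estimating a diffusion coefficient from the mean squared displacement (MSD) of a trajectory. The sketch below shows that calculation for a simulated 2D Brownian track; the frame interval, diffusion coefficient, and track length are illustrative values, not parameters from any study reviewed here.

```python
import numpy as np

# Simulate a 2D Brownian trajectory (units: micrometres, seconds).
# These parameters are purely illustrative.
rng = np.random.default_rng(0)
D_true, dt, n_frames = 0.1, 0.01, 1000
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_frames, 2))
track = np.cumsum(steps, axis=0)

def msd(track, dt, max_lag=50):
    """Time-averaged mean squared displacement for lags 1..max_lag frames."""
    lags = np.arange(1, max_lag + 1)
    out = np.empty(lags.size)
    for i, lag in enumerate(lags):
        disp = track[lag:] - track[:-lag]
        out[i] = np.mean(np.sum(disp ** 2, axis=1))
    return lags * dt, out

# For free 2D diffusion MSD(t) ~= 4*D*t, so D follows from a fit to short lags.
taus, m = msd(track, dt)
D_est = np.polyfit(taus[:10], m[:10], 1)[0] / 4.0
print(f"true D = {D_true}, estimated D = {D_est:.3f}")
```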
Abstract:
Video-oculography devices are now used to quantify the vestibulo-ocular reflex (VOR) at the bedside using the head impulse test (HIT). Little is known about the impact of disruptive phenomena (e.g. corrective saccades, nystagmus, fixation losses, eye-blink artifacts) on quantitative VOR assessment in acute vertigo. This study systematically characterized the frequency, nature, and impact of artifacts on HIT VOR measures. From a prospective study of 26 patients with acute vestibular syndrome (16 vestibular neuritis, 10 stroke), we classified findings using a structured coding manual. Of 1,358 individual HIT traces, 72% had abnormal disruptive saccades, 44% had at least one artifact, and 42% were uninterpretable. Physicians using quantitative recording devices to measure head impulse VOR responses for clinical diagnosis should be aware of the potential impact of disruptive eye movements and measurement artifacts.
Abstract:
Maternal thromboembolism and a spectrum of placenta-mediated complications including the pre-eclampsia syndromes, fetal growth restriction, fetal loss, and abruption manifest a shared etiopathogenesis and predisposing risk factors. Furthermore, these maternal and fetal complications are often linked to subsequent maternal health consequences that comprise the metabolic syndrome, namely, thromboembolism, chronic hypertension, and type II diabetes. Traditionally, several lines of evidence have linked vasoconstriction, excessive thrombosis and inflammation, and impaired trophoblast invasion at the uteroplacental interface as hallmark features of the placental complications. "Omic" technologies and biomarker development have been largely based upon advances in vascular biology, improved understanding of the molecular basis and biochemical pathways responsible for the clinically relevant diseases, and increasingly robust large cohort and/or registry based studies. Advances in understanding of innate and adaptive immunity appear to play an important role in several pregnancy complications. Strategies aimed at improving prediction of these pregnancy complications are often incorporating hemodynamic blood flow data using non-invasive imaging technologies of the utero-placental and maternal circulations early in pregnancy. Some evidence suggests that a multiple marker approach will yield the best performing prediction tools, which may then in turn offer the possibility of early intervention to prevent or ameliorate these pregnancy complications. Prediction of maternal cardiovascular and non-cardiovascular consequences following pregnancy represents an important area of future research, which may have significant public health consequences not only for cardiovascular disease, but also for a variety of other disorders, such as autoimmune and neurodegenerative diseases.
Abstract:
By William Kramer. Translated from the German, with the latest improvements of the author since the last German edition, by James Risdon Bennett.
Abstract:
Although the area under the receiver operating characteristic curve (AUC) is the most popular measure of the performance of prediction models, it has limitations, especially when it is used to evaluate the added discrimination of a new biomarker in the model. Pencina et al. (2008) proposed two indices, the net reclassification improvement (NRI) and integrated discrimination improvement (IDI), to supplement the improvement in the AUC (IAUC). Their NRI and IDI are based on binary outcomes in case-control settings, which do not involve time-to-event outcomes. However, many disease outcomes are time-dependent and the onset time can be censored. Measuring the discrimination potential of a prognostic marker without considering time to event can lead to biased estimates. In this dissertation, we have extended the NRI and IDI to survival analysis settings and derived the corresponding sample estimators and asymptotic tests. Simulation studies were conducted to compare the performance of the time-dependent NRI and IDI with Pencina's NRI and IDI. For illustration, we have applied the proposed method to a breast cancer study. Key words: Prognostic model, Discrimination, Time-dependent NRI and IDI
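For reference, the binary-outcome indices that the dissertation extends are straightforward to compute from predicted risks. The sketch below implements the category-free NRI and the IDI of Pencina et al. (2008) on made-up data; it does not include the time-dependent, censoring-aware extension that is the dissertation's contribution.

```python
import numpy as np

def nri_idi(y, p_old, p_new):
    """Category-free NRI and IDI for binary outcomes (Pencina et al., 2008).
    y: 0/1 outcome; p_old, p_new: predicted risks from the old and new models."""
    y = np.asarray(y).astype(bool)
    p_old, p_new = np.asarray(p_old, float), np.asarray(p_new, float)
    up, down = p_new > p_old, p_new < p_old
    # NRI: net proportion of events moving up plus net proportion of non-events moving down.
    nri = (up[y].mean() - down[y].mean()) + (down[~y].mean() - up[~y].mean())
    # IDI: improvement in discrimination slope (mean risk in events minus non-events).
    idi = (p_new[y].mean() - p_new[~y].mean()) - (p_old[y].mean() - p_old[~y].mean())
    return nri, idi

# Toy example with hypothetical risks for six subjects (1 = event, 0 = non-event).
y     = [1, 1, 1, 0, 0, 0]
p_old = [0.60, 0.40, 0.50, 0.30, 0.20, 0.40]
p_new = [0.70, 0.55, 0.45, 0.25, 0.25, 0.30]
print(nri_idi(y, p_old, p_new))
```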
Abstract:
Objectives. This paper seeks to assess the effect of regression model misspecification on statistical power in a variety of situations. Methods and results. The effect of misspecification in regression can be approximated by evaluating the correlation between the correct specification and the misspecification of the outcome variable (Harris 2010). In this paper, three misspecified models (linear, categorical and fractional polynomial) were considered. In the first section, the mathematical method of calculating the correlation between correct and misspecified models with simple mathematical forms was derived and demonstrated. In the second section, data from the National Health and Nutrition Examination Survey (NHANES 2007-2008) were used to examine such correlations. Our study shows that, compared with linear or categorical models, the fractional polynomial models, with their higher correlations, provided a better approximation of the true relationship, as illustrated by LOESS regression. In the third section, we present the results of simulation studies demonstrating that misspecification in regression can produce marked decreases in power with small sample sizes. However, the categorical model had the greatest power, ranging from 0.877 to 0.936 depending on sample size and outcome variable used. The power of the fractional polynomial model was close to that of the linear model, ranging from 0.69 to 0.83, and appeared to be affected by the increased degrees of freedom of this model. Conclusion. Correlations between alternative model specifications can be used to provide a good approximation of the effect of misspecification on statistical power when the sample size is large. When model specifications have known simple mathematical forms, such correlations can be calculated mathematically. Actual public health data from NHANES 2007-2008 were used as examples to demonstrate situations with unknown or complex correct model specification. Simulation of power for misspecified models confirmed the results based on correlation methods but also illustrated the effect of model degrees of freedom on power.
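As a concrete illustration of the kind of simulation described above, the sketch below draws an outcome that truly depends on the square root of a predictor and estimates the power of the overall F-test under a linear, a quartile-categorical, and a fractional-polynomial specification. The true model, effect size, and sample size are arbitrary choices for the example, not the NHANES variables analysed in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def power(design, n=100, n_sim=2000, beta=0.5, sigma=1.0, alpha=0.05):
    """Empirical power of the overall F-test when Y truly depends on sqrt(X)
    but is modelled with the columns produced by `design`."""
    rejections = 0
    for _ in range(n_sim):
        x = rng.uniform(1.0, 10.0, n)
        y = beta * np.sqrt(x) + rng.normal(0.0, sigma, n)
        X = np.column_stack([np.ones(n), design(x)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        sse, sst = resid @ resid, np.sum((y - y.mean()) ** 2)
        p = X.shape[1] - 1                         # model degrees of freedom
        F = ((sst - sse) / p) / (sse / (n - p - 1))
        rejections += stats.f.sf(F, p, n - p - 1) < alpha
    return rejections / n_sim

linear      = lambda x: x[:, None]
categorical = lambda x: (np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))[:, None]
                         == np.arange(1, 4)).astype(float)        # quartile dummies
frac_poly   = lambda x: np.column_stack([np.sqrt(x), x])          # contains the true term

for name, design in [("linear", linear), ("categorical", categorical),
                     ("fractional polynomial", frac_poly)]:
    print(f"{name:>22}: power = {power(design):.2f}")
```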
Abstract:
Sizes and powers of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, and 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests performed at nominal 5% size expectations, but the F, Score and Mantel tests exceeded the 5% size confidence limits for one third of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test the most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests in the case of late-crossing survival curves and less powerful for early-crossing curves. Guidelines for the appropriate selection of two-sample tests are given.
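To make the simulation setup concrete, the sketch below estimates the empirical size of the log-rank (Mantel) test for two small, unequally censored exponential samples, using the `lifelines` package. The per-group sample size, censoring proportions, and number of trials are illustrative and do not reproduce the full 16-combination design of the study.

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)

def censored_exponential(n, rate, cens_prop):
    """Exponential event times with independent exponential censoring, tuned so
    that about `cens_prop` of the observations are censored."""
    t = rng.exponential(1.0 / rate, n)
    if cens_prop == 0.0:
        return t, np.ones(n, dtype=bool)
    c = rng.exponential((1.0 / rate) * (1.0 - cens_prop) / cens_prop, n)
    return np.minimum(t, c), t <= c               # observed time, event indicator

# Empirical size of the log-rank test under equal hazards (ratio 1.0) with
# 20% censoring in one arm and 60% in the other, n = 8 per group.
n, n_sim, alpha, rejections = 8, 1000, 0.05, 0
for _ in range(n_sim):
    t1, e1 = censored_exponential(n, rate=1.0, cens_prop=0.20)
    t2, e2 = censored_exponential(n, rate=1.0, cens_prop=0.60)
    result = logrank_test(t1, t2, event_observed_A=e1, event_observed_B=e2)
    rejections += result.p_value < alpha
print(f"empirical size at nominal 5%: {rejections / n_sim:.3f}")
```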
Abstract:
This paper presents an analysis of the fault tolerance achieved by an autonomous, fully embedded evolvable hardware system, which uses a combination of partial dynamic reconfiguration and an evolutionary algorithm (EA). It demonstrates that the system may self-recover from both transient and cumulative permanent faults. This self-adaptive system, based on a 2D array of 16 (4×4) Processing Elements (PEs), is tested with an image filtering application. Results show that it may properly recover from faults in up to 3 PEs, that is, more than 18% cumulative permanent faults. Two fault models are used for testing purposes, at PE and CLB levels. Two self-healing strategies are also introduced, depending on whether fault diagnosis is available or not. They are based on scrubbing, fitness evaluation, dynamic partial reconfiguration and in-system evolutionary adaptation. Since most of these adaptability features are already available on the system for its normal operation, the resource cost for self-healing is very low (only some code additions in the internal microprocessor core).
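The recovery loop behind the diagnosis-free strategy (evaluate fitness and evolve until quality is acceptable again) can be summarised with a toy Python rendition. Here a "configuration" is an invented bit string and permanent faults are modelled as stuck-at-zero bits; scrubbing, which handles transient faults by rewriting the bitstream, is not modelled. The real system evolves FPGA processing elements through partial dynamic reconfiguration, so this sketch only illustrates the control flow.

```python
import random
random.seed(0)

# Toy stand-in: filter quality is the number of bits matching a reference, and
# permanent faults force some bit positions to zero. None of this is the paper's
# actual encoding; it only sketches the evaluate/evolve recovery loop.
TARGET = [1] * 32
STUCK_AT_ZERO = {3, 17}                              # simulated permanent faults

def apply_faults(conf):
    return [0 if i in STUCK_AT_ZERO else bit for i, bit in enumerate(conf)]

def fitness(conf):
    return sum(a == b for a, b in zip(conf, TARGET))

def self_heal(conf, target_fitness, max_generations=2000, mutation_rate=0.05):
    best = apply_faults(conf)
    best_fit = fitness(best)
    for _ in range(max_generations):                 # (1+1)-style evolutionary adaptation
        child = apply_faults([bit ^ (random.random() < mutation_rate) for bit in best])
        if fitness(child) >= best_fit:
            best, best_fit = child, fitness(child)
        if best_fit >= target_fitness:
            break
    return best_fit

# A corrupted configuration is evolved back to acceptable quality (30 of 32 bits;
# the two stuck-at positions cannot be repaired, only worked around).
print("recovered fitness:", self_heal([0] * 32, target_fitness=30))
```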
Abstract:
This paper presents the results of a strategy to modernise the Spanish university system through the establishment of an International Campus of Excellence (CEI). The current project, ambitious but realistic, is a joint initiative of a number of institutions located on the Moncloa Campus, amongst them the Complutense and the Technical Universities, as well as CIEMAT, CSIC and INIA. The aim of the project is to transform the Moncloa Campus into an international point of reference for research, education and innovation. This paper describes the project and presents its qualitative and quantitative results.
Abstract:
Tissue P systems generalize the membrane structure tree, usual in original models of P systems, to an arbitrary graph. The basic operations in these systems are communication rules, enriched in some variants with cell division or cell separation. Several variants of tissue P systems were recently studied, together with the concept of uniform families of these systems. Their computational power was shown to range between P and NP ∪ co-NP, thus characterizing some interesting borderlines between tractability and intractability. In this paper we show that the computational power of these uniform families in polynomial time is limited by the class PSPACE. This class characterizes the power of many classical parallel computing models.
Abstract:
Bayesian network classifiers are widely used in machine learning because they intuitively represent causal relations. Multi-label classification problems require each instance to be assigned a subset of a defined set of h labels. This problem is equivalent to finding a multi-valued decision function that predicts a vector of h binary classes. In this paper we obtain the decision boundaries of two widely used Bayesian network approaches for building multi-label classifiers: Multi-label Bayesian network classifiers built using the binary relevance method and Bayesian network chain classifiers. We extend our previous single-label results to multi-label chain classifiers, and we prove that, as expected, chain classifiers provide a more expressive model than the binary relevance method.
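The two model families compared in the paper correspond to readily available meta-estimators in scikit-learn, so a small experiment can make the contrast concrete. The sketch below uses naive Bayes as a stand-in base classifier (the simplest Bayesian network classifier) on a synthetic multi-label data set; it illustrates binary relevance versus a classifier chain, not the paper's decision-boundary analysis.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain, MultiOutputClassifier
from sklearn.naive_bayes import BernoulliNB
from sklearn.metrics import f1_score

# Synthetic problem with h = 5 binary labels per instance.
X, Y = make_multilabel_classification(n_samples=2000, n_features=20,
                                      n_classes=5, n_labels=2, random_state=0)
X = (X > X.mean(axis=0)).astype(int)          # binarise features for BernoulliNB
Xtr, Xte, Ytr, Yte = train_test_split(X, Y, random_state=0)

# Binary relevance: one independent classifier per label.
br = MultiOutputClassifier(BernoulliNB()).fit(Xtr, Ytr)
# Chain: each classifier also sees the predictions for the previous labels,
# which is what gives chain classifiers their extra expressiveness.
chain = ClassifierChain(BernoulliNB(), order=list(range(Y.shape[1])),
                        random_state=0).fit(Xtr, Ytr)

for name, model in [("binary relevance", br), ("classifier chain", chain)]:
    print(f"{name:>17}: micro-F1 = {f1_score(Yte, model.predict(Xte), average='micro'):.3f}")
```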
Abstract:
Shading reduces the power output of a photovoltaic (PV) system. The design engineering of PV systems requires modeling and evaluating shading losses. Some PV systems are affected by complex shading scenes whose resulting PV energy losses are very difficult to evaluate with current modeling tools. Several specialized PV design and simulation software packages include the possibility to evaluate shading losses. They generally provide a Graphical User Interface (GUI) through which the user can draw a 3D shading scene and then evaluate its corresponding PV energy losses. The complexity of the objects that these tools can handle is relatively limited. We have created a software solution, 3DPV, which allows evaluating the energy losses induced by complex 3D scenes on PV generators. The 3D objects can be imported from specialized 3D modeling software or from a 3D object library. The shadows cast by this 3D scene on the PV generator are then evaluated directly on the Graphics Processing Unit (GPU). Thanks to the recent development of GPUs for the video game industry, the shadows can be evaluated with a very high spatial resolution, well beyond the PV cell level, in very short calculation times. A PV simulation model then translates the geometrical shading into PV energy output losses. 3DPV has been implemented using WebGL, which allows it to run directly from a Web browser, without requiring any local installation by the user. This also makes it possible to take full advantage of information already available on the Internet, such as 3D object libraries. This contribution describes, step by step, the method that allows 3DPV to evaluate the PV energy losses caused by complex shading. We then illustrate the results of this methodology with several application cases encountered in the design of PV systems. Keywords: 3D, modeling, simulation, GPU, shading, losses, shadow mapping, solar, photovoltaic, PV, WebGL
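As a rough illustration of the final step of the pipeline (turning a per-cell shading map into an energy figure), the sketch below applies a very simple series-string model in which the most shaded cell limits the whole string. This is only a first-order approximation chosen for the example; 3DPV itself runs in WebGL and uses a full PV simulation model, and the shading values here are invented.

```python
import numpy as np

def string_power_loss(shaded_fraction, irradiance=1000.0):
    """Fractional power loss of one series string of cells, assuming the string
    current is limited by its most shaded cell (bypass diodes ignored).
    shaded_fraction: per-cell shaded area fraction, e.g. from shadow mapping."""
    effective_irradiance = irradiance * (1.0 - np.asarray(shaded_fraction, float))
    limiting_irradiance = effective_irradiance.min()   # worst cell limits the string
    return 1.0 - limiting_irradiance / irradiance

# Hypothetical per-cell shading fractions for a 6-cell string.
shading_map = np.array([0.0, 0.0, 0.35, 0.0, 0.10, 0.0])
print(f"estimated string power loss: {string_power_loss(shading_map):.0%}")
```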