906 results for Timed and Probabilistic Automata


Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: BACKGROUND: Many parasitic organisms, eukaryotes as well as bacteria, possess surface antigens with amino acid repeats. Making up the interface between host and pathogen, such repetitive proteins may be virulence factors involved in immune evasion or cytoadherence. They find immunological applications in serodiagnostics and vaccine development. Here we use proteins which contain perfect repeats as a basis for comparative genomics between parasitic and free-living organisms. RESULTS: We have developed Reptile http://reptile.unibe.ch, a program for proteome-wide probabilistic description of perfect repeats in proteins. Parasite proteomes exhibited a large variance regarding the proportion of repeat-containing proteins. Interestingly, there was a good correlation between the percentage of highly repetitive proteins and mean protein length in parasite proteomes, but not at all in the proteomes of free-living eukaryotes. Reptile combined with programs for the prediction of transmembrane domains and GPI-anchoring resulted in an effective tool for in silico identification of potential surface antigens and virulence factors from parasites. CONCLUSION: Systematic surveys for perfect amino acid repeats allowed basic comparisons between free-living and parasitic organisms that were directly applicable to predict proteins of serological and parasitological importance. An on-line tool is available at http://genomics.unibe.ch/dora.
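The core operation underlying such a tool, scanning a protein sequence for perfect tandem repeats, can be sketched in a few lines. This is a minimal illustration rather than Reptile's actual probabilistic model; the unit-length and copy-number thresholds are hypothetical parameters:

```python
def perfect_repeats(seq, min_unit=2, max_unit=10, min_copies=3):
    """Scan a sequence for perfect tandem repeats.

    Returns (start, unit, copies) tuples.  Thresholds are illustrative,
    not Reptile's actual scoring parameters.
    """
    hits = []
    n = len(seq)
    for unit_len in range(min_unit, max_unit + 1):
        i = 0
        while i + unit_len * min_copies <= n:
            unit = seq[i:i + unit_len]
            copies = 1
            j = i + unit_len
            # extend as long as the next window repeats the unit exactly
            while seq[j:j + unit_len] == unit:
                copies += 1
                j += unit_len
            if copies >= min_copies:
                hits.append((i, unit, copies))
                i = j  # skip past the repeat region
            else:
                i += 1
    return hits

print(perfect_repeats("MAEPQNANANANANGK"))  # [(5, 'NA', 4)]
```

For the toy sequence above, the scan reports the four-copy "NA" repeat starting at position 5.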

Relevance:

30.00%

Publisher:

Abstract:

The municipality of San Juan La Laguna, Guatemala is home to approximately 5,200 people and located on the western side of the Lake Atitlán caldera. Steep slopes surround all but the eastern side of San Juan. The Lake Atitlán watershed is susceptible to many natural hazards, but most predictable are the landslides that can occur annually with each rainy season, especially during high-intensity events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the Atitlán region. Locations of landslide and non-landslide points were obtained from field observations and orthophotos taken following Hurricane Stan. This study used data on multiple attributes at every landslide and non-landslide point and applied different multivariate analyses to optimize a model for landslide prediction during high-intensity precipitation events like Hurricane Stan. The attributes considered in this study are: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature and topographic wetness index. The attributes were pre-evaluated for their ability to predict landslides using four different attribute evaluators, all available in the open source data mining software Weka: filtered subset, information gain, gain ratio and chi-squared. Three multivariate algorithms (decision tree J48, logistic regression and BayesNet) were optimized for landslide prediction using different attributes. The following statistical parameters were used to evaluate model accuracy: precision, recall, F measure and area under the receiver operating characteristic (ROC) curve. The algorithm BayesNet yielded the most accurate model and was used to build a probability map of landslide initiation points. The probability map developed in this study was also compared to the results of a bivariate landslide susceptibility analysis conducted for the watershed, encompassing Lake Atitlán and San Juan.
Landslides from Tropical Storm Agatha 2010 were used to independently validate this study’s multivariate model and the bivariate model. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective future landslide hazard planning and mitigation.
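The evaluation metrics named above follow directly from a 2x2 confusion matrix. A minimal sketch, with made-up counts rather than the study's actual Weka output:

```python
def classification_metrics(tp, fp, fn, tn):
    """Precision, recall and F-measure from a 2x2 confusion matrix,
    as used to compare the Weka models in the study.  The counts
    passed below are illustrative, not the study's data."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # a.k.a. sensitivity
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# hypothetical confusion counts for a landslide/no-landslide model
p, r, f = classification_metrics(tp=80, fp=20, fn=10, tn=90)
print(round(p, 3), round(r, 3), round(f, 3))  # 0.8 0.889 0.842
```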

Relevance:

30.00%

Publisher:

Abstract:

Sensor networks have been an active research area in the past decade due to the variety of their applications. Many research studies have been conducted to solve the problems underlying the middleware services of sensor networks, such as self-deployment, self-localization, and synchronization. With the provided middleware services, sensor networks have grown into a mature technology to be used as a detection and surveillance paradigm for many real-world applications. The individual sensors are small in size. Thus, they can be deployed in areas with limited space to make unobstructed measurements in locations where traditional centralized systems would have trouble reaching. However, there are a few physical limitations to sensor networks, which can prevent sensors from performing at their maximum potential. Individual sensors have a limited power supply, and the wireless band can get very cluttered when multiple sensors try to transmit at the same time. Furthermore, the individual sensors have limited communication range, so the network may not have a 1-hop communication topology and routing can be a problem in many cases. Carefully designed algorithms can alleviate the physical limitations of sensor networks, and allow them to be utilized to their full potential. Graphical models are an intuitive choice for designing sensor network algorithms. This thesis focuses on a classic application in sensor networks, the detection and tracking of targets. It develops feasible inference techniques for sensor networks using statistical graphical model inference, binary sensor detection, event isolation and dynamic clustering. The main strategy is to use only binary data for rough global inferences, and then dynamically form small-scale clusters around the target for detailed computations. This framework is then extended to network topology manipulation, so that the framework developed can be applied to tracking in different network topology settings.
Finally, the system was tested in both simulation and real-world environments. The simulations were performed on various network topologies, from regularly distributed to randomly distributed networks. The results show that the algorithm performs well in randomly distributed networks and hence requires minimal deployment effort. The experiments were carried out in both corridor and open-space settings. An in-home fall detection system was simulated under real-world settings: it was set up with 30 bumblebee radars and 30 ultrasonic sensors driven by TI EZ430-RF2500 boards scanning a typical 800 sqft apartment. The bumblebee radars were calibrated to detect the fall of a human body, and the two-tier tracking algorithm was used on the ultrasonic sensors to track the location of elderly people.
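The two-tier strategy described above (rough global inference from binary data, then a dynamic cluster around the target for detailed computation) can be sketched as follows. The grid layout, detection vector, and clustering radius are illustrative assumptions, not the thesis's actual graphical-model inference:

```python
def rough_target_estimate(sensors, detections):
    """First tier: estimate the target position as the centroid of the
    sensors reporting a binary detection (1)."""
    active = [pos for pos, hit in zip(sensors, detections) if hit]
    if not active:
        return None
    xs, ys = zip(*active)
    return (sum(xs) / len(active), sum(ys) / len(active))

def form_cluster(sensors, estimate, radius):
    """Second tier: dynamically cluster sensors near the rough estimate
    for detailed (e.g. range-based) computations."""
    ex, ey = estimate
    return [p for p in sensors if (p[0] - ex) ** 2 + (p[1] - ey) ** 2 <= radius ** 2]

grid = [(x, y) for x in range(3) for y in range(3)]  # 3x3 sensor grid
hits = [0, 0, 0, 0, 1, 1, 0, 1, 1]                   # binary readings
est = rough_target_estimate(grid, hits)
print(est, form_cluster(grid, est, 1.0))
```

With the toy readings above, the centroid lands at (1.5, 1.5) and the dynamic cluster contains the four triggered sensors.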

Relevance:

30.00%

Publisher:

Abstract:

Wind energy has been one of the fastest-growing sectors of the nation’s renewable energy portfolio for the past decade, and the same tendency is projected for the upcoming years given the aggressive governmental policies for the reduction of fossil fuel dependency. So-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration. Given this broad acceptance, the size of wind turbines has increased exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies toward massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information regarding actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions, at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, have called for the use of a more sophisticated mathematical framework that can properly handle all these sources of indetermination. The modeling complexity that arises as a result of these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model.
For this aim, System Identification (SI) techniques offer a spectrum of well-established numerical methods appropriate for stationary, deterministic, data-driven numerical schemes, capable of predicting actual dynamic states (eigen-realizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric is proposed, based on so-called Subspace Realization Theory, now adapted for stochastic, non-stationary, and time-varying systems, as is the case of a HAWT's complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components the wind turbine is made of. In the long run, both the aerodynamic framework (theoretical model) and system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, also known as an Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed by a metric called the Modal Assurance Criterion (MAC). In summary, the Thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacted wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped-gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
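The Modal Assurance Criterion mentioned above has a standard definition: the normalized squared inner product of two mode-shape vectors. A minimal sketch (real-valued shapes, pure Python):

```python
def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors:
    MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a) * (phi_b . phi_b)).
    1.0 means perfectly correlated shapes; 0.0 means orthogonal ones."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    return dot(phi_a, phi_b) ** 2 / (dot(phi_a, phi_a) * dot(phi_b, phi_b))

# identical shapes (up to scale) give MAC = 1
print(mac([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # 1.0
# orthogonal shapes give MAC = 0
print(mac([1.0, 0.0, -1.0], [1.0, 1.0, 1.0]))  # 0.0
```

In a model-updating loop such as the ASA engine described here, 1 - MAC (summed over mode pairs) is a natural choice of minimization objective, though the thesis's exact formulation is not given in this abstract.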

Relevance:

30.00%

Publisher:

Abstract:

Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and pairs of beta-strands as fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand-pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with the results of previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator) in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices, amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s), there exist multiple sets of experimental data implying contradictory structures; BETASCAN is able to detect each competing structure as a potential structure variant. The ability to correlate multiple alternate beta-structures to experiment opens the possibility of computational investigation of prion strains and structural heterogeneity of amyloid. 
BETASCAN is publicly accessible on the Web at http://betascan.csail.mit.edu.
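Window-based likelihood scoring of candidate strands can be illustrated as below. Note that the single-residue propensity table and window length are invented for the example; BETASCAN's actual tables encode pairwise correlations observed in parallel beta-sheets, not these per-residue scores:

```python
import math

# Toy per-residue beta-strand propensities (hypothetical values)
PROPENSITY = {"V": 1.7, "I": 1.6, "Y": 1.5, "F": 1.4, "T": 1.2,
              "A": 0.8, "G": 0.6, "P": 0.3, "S": 0.9, "N": 0.7}

def strand_scores(seq, window=5):
    """Score every window as a candidate beta-strand by summed
    log-propensity; higher means more strand-like."""
    scores = []
    for i in range(len(seq) - window + 1):
        s = sum(math.log(PROPENSITY.get(aa, 1.0)) for aa in seq[i:i + window])
        scores.append((i, round(s, 3)))
    return scores

# the window with the greatest local likelihood is the predicted strand
best = max(strand_scores("GPNVIYTVAS"), key=lambda t: t[1])
print(best)
```

Ranking all windows rather than reporting a single winner is what lets a BETASCAN-style approach surface multiple alternate, possibly contradictory, structure variants.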

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Results of epidemiological studies linking census with mortality records may be affected by unlinked deaths and changes in cause of death classification. We examined these issues in the Swiss National Cohort (SNC). METHODS The SNC is a longitudinal study of the entire Swiss population, based on the 1990 (6.8 million persons) and 2000 (7.3 million persons) censuses. Among 1,053,393 deaths recorded in 1991-2007, 5.4% could not be linked using stringent probabilistic linkage. We included the unlinked deaths using pragmatic linkages and compared mortality rates for selected causes with official mortality rates. We also examined the impact of the 1995 change in cause of death coding from version 8 (with some additional rules) to version 10 of the International Classification of Diseases (ICD), using Poisson regression models with restricted cubic splines. Finally, we compared results from Cox models including and excluding unlinked deaths for the association of education, marital status, and nationality with selected causes of death. RESULTS SNC mortality rates underestimated all-cause mortality by 9.6% (range 2.4%-17.9%) in the 85+ population. Underestimation was less pronounced in years nearer the censuses and in the 75-84 age group. After including 99.7% of unlinked deaths, annual all-cause SNC mortality rates closely matched official rates (relative difference between -1.4% and +1.8%). In the 85+ population, a sudden decrease in breast (21% less, 95% confidence interval: 12%-28%) and prostate (16% less, 95% confidence interval: 7%-23%) cancer mortality rates between 1994 and 1995 coincided with the change in cause of death coding policy. For suicide in males almost no change was observed. Hazard ratios were only negligibly affected by including the unlinked deaths.
CONCLUSIONS Unlinked deaths bias analyses of absolute mortality rates downwards but have little effect on relative mortality. To describe time trends of cause-specific mortality in the SNC, accounting for the unlinked deaths and for the possible effect of change in death certificate coding was necessary.
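The relative difference used above to compare SNC with official rates is a simple ratio. A sketch with hypothetical rates (per 100,000) chosen to fall inside the reported -1.4% to +1.8% band:

```python
def relative_difference(snc_rate, official_rate):
    """Relative difference between an SNC mortality rate and the
    official rate, expressed as a fraction of the official rate."""
    return (snc_rate - official_rate) / official_rate

# hypothetical all-cause rates per 100,000 person-years
print(round(100 * relative_difference(990.0, 1000.0), 1))  # -1.0
```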

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Atrial tachycardias (AT) during or after ablation of atrial fibrillation frequently pose a diagnostic challenge. We hypothesized that both the patterns and the timing of coronary sinus (CS) activation could facilitate AT mapping. METHODS AND RESULTS A total of 140 consecutive postpersistent atrial fibrillation ablation patients with sustained AT were investigated by conventional mapping. CS activation pattern was defined as chevron or reverse chevron when the activations recorded on both the proximal and the distal CS dipoles were latest or earliest, respectively. The local activation of mid-CS was timed with reference to Ppeak-Ppeak (P-P) interval in lead V1. A ratio, mid-CS activation time to AT cycle length, was computed. Of 223 diagnosed ATs, 124 were macroreentrant (56%) and 99 were centrifugal (44%). When CS activation was chevron/reverse chevron (n=44; 20%), macroreentries were mostly roof dependent. With reference to P-P interval, mid-CS activation timing showed specific consistency for peritricuspid and perimitral AT. Proximal to distal CS activation pattern and mid-CS activation at 50% to 70% of the P-P interval (n=30; 13%) diagnosed peritricuspid AT with 81% sensitivity and 89% specificity. Distal to proximal CS activation and mid-CS activation at 10% to 40% of the P-P interval (n=44; 20%) diagnosed perimitral AT with 88% sensitivity and 75% specificity. CONCLUSIONS The analysis of the patterns and timing of CS activation provides a rapid stratification of most likely macroreentrant ATs and points toward the likely origin of centrifugal ATs. It can be included in a stepwise diagnostic approach to rapidly select the most critical mapping maneuvers.
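The reported timing criteria amount to a simple decision rule over the CS activation pattern and the mid-CS/P-P ratio. A sketch (the function name and return labels are hypothetical, and this simplifies the paper's stepwise diagnostic approach to its two headline criteria):

```python
def classify_at(cs_pattern, mid_cs_ratio):
    """Apply the reported CS criteria: proximal-to-distal activation
    with mid-CS timing at 50-70% of the P-P interval suggests
    peritricuspid AT (81% sensitivity, 89% specificity);
    distal-to-proximal at 10-40% suggests perimitral AT
    (88% sensitivity, 75% specificity)."""
    if cs_pattern == "proximal-to-distal" and 0.50 <= mid_cs_ratio <= 0.70:
        return "peritricuspid AT likely"
    if cs_pattern == "distal-to-proximal" and 0.10 <= mid_cs_ratio <= 0.40:
        return "perimitral AT likely"
    return "pattern not specific; further mapping needed"

print(classify_at("proximal-to-distal", 0.60))
print(classify_at("distal-to-proximal", 0.25))
```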

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The reward system in schizophrenia has been linked to the emergence of delusions on the one hand and to negative symptoms such as affective flattening on the other hand. Previous Diffusion Tensor Imaging (DTI) studies reported white matter microstructure alterations of regions related to the reward system. The present study aimed at extending these findings by specifically investigating connection pathways of the reward system in schizophrenia. Therefore, 24 patients with schizophrenia and 22 healthy controls matched for age and gender underwent DTI scans. Using a probabilistic fiber tracking approach we bilaterally extracted pathways connecting the ventral tegmental area (VTA) with the nucleus accumbens (NAcc), the medial and lateral orbitofrontal cortices (mOFC, lOFC), the dorsolateral prefrontal cortex (dlPFC) and the amygdala; as well as pathways connecting NAcc with mOFC, lOFC, dlPFC and amygdala, resulting in a total of 18 connections. Probability indices forming part of a bundle of interest (PIBI) were compared between groups using independent t-tests. In 6 connection pathways PIBI values were increased in schizophrenia. In 3 of these pathways the spatial extension of connection pathways was decreased. In schizophrenia patients, there was a negative correlation of PIBI values and PANSS negative scores in the left VTA–amygdala and in the left NAcc–mOFC connection. A sum score of delusions and hallucinations correlated positively with PIBI values of the left amygdala–NAcc connection. Structural organization of specific segments of white matter pathways of the reward system in schizophrenia may contribute to the emergence of delusions and negative symptoms in schizophrenia.

Relevance:

30.00%

Publisher:

Abstract:

Web-scale knowledge retrieval can be enabled by distributed information retrieval, clustering Web clients into a large-scale computing infrastructure for knowledge discovery from Web documents. Based on this infrastructure, we propose to apply semiotic (i.e., sub-syntactical) and inductive (i.e., probabilistic) methods for inferring concept associations in human knowledge. These associations can be combined to form a fuzzy (i.e., gradual) semantic net representing a map of the knowledge in the Web. Thus, we propose to provide interactive visualizations of these cognitive concept maps to end users, who can browse and search the Web in a human-oriented, visual, and associative interface.
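Gradual (fuzzy) concept associations of this kind can be approximated from document co-occurrence. The weighting scheme below is a toy stand-in for the proposed semiotic and inductive methods, normalizing co-occurrence counts into [0, 1] edge weights for a fuzzy semantic net:

```python
from collections import Counter
from itertools import combinations

def association_strengths(documents):
    """Derive gradual concept associations from term co-occurrence,
    normalized to [0, 1] for use as fuzzy edge weights.  Illustrative
    only; not the authors' actual inference method."""
    term_counts = Counter()
    pair_counts = Counter()
    for doc in documents:
        terms = set(doc.split())
        term_counts.update(terms)
        pair_counts.update(frozenset(p) for p in combinations(terms, 2))
    edges = {}
    for pair, c in pair_counts.items():
        a, b = tuple(pair)
        # co-occurrence count scaled by the rarer term's frequency
        edges[tuple(sorted(pair))] = c / min(term_counts[a], term_counts[b])
    return edges

docs = ["web knowledge retrieval", "web retrieval", "knowledge map"]
print(association_strengths(docs))
```

Thresholding these weights (or keeping them as membership degrees) yields the fuzzy semantic net that the proposed interface would visualize.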

Relevance:

30.00%

Publisher:

Abstract:

We used micro-infusions during eyelid conditioning in rabbits to investigate the relative contributions of cerebellar cortex and the underlying deep nuclei (DCN) to the expression of cerebellar learning. These tests were conducted using two forms of cerebellum-dependent eyelid conditioning for which the relative roles of cerebellar cortex and DCN are controversial: delay conditioning, which is largely unaffected by forebrain lesions, and trace conditioning, which involves interactions between forebrain and cerebellum. For rabbits trained with delay conditioning, silencing cerebellar cortex by micro-infusions of the local anesthetic lidocaine unmasked stereotyped short-latency responses. This was also the case after extinction, as observed previously with reversible blockade of cerebellar cortex output. Conversely, increasing cerebellar cortex activity by micro-infusions of the GABA(A) antagonist picrotoxin reversibly abolished conditioned responses. Effective cannula placements were clustered around the primary fissure and deeper in hemispheric lobules IV (HIV) and V (HV) of the anterior lobe. In well-trained trace conditioned rabbits, silencing this same area of cerebellar cortex or reversibly blocking cerebellar cortex output also unmasked short-latency responses. Because Purkinje cells are the sole output of cerebellar cortex, these results provide evidence that the expression of well-timed conditioned responses requires a well-timed decrease in the activity of Purkinje cells in the anterior lobe. The parallels between results from delay and trace conditioning suggest similar contributions of plasticity in cerebellar cortex and DCN in both instances.

Relevance:

30.00%

Publisher:

Abstract:

How do probabilistic models represent their targets and how do they allow us to learn about them? The answer to this question depends on a number of details, in particular on the meaning of the probabilities involved. To classify the options, a minimalist conception of representation (Suárez 2004) is adopted: Modelers devise substitutes ("sources") of their targets and investigate them to infer something about the target. Probabilistic models allow us to infer probabilities about the target from probabilities about the source. This leads to a framework in which we can systematically distinguish between different models of probabilistic modeling. I develop a fully Bayesian view of probabilistic modeling, but I argue that, as an alternative, Bayesian degrees of belief about the target may be derived from ontic probabilities about the source. Remarkably, some accounts of ontic probabilities can avoid problems if they are supposed to apply to sources only.

Relevance:

30.00%

Publisher:

Abstract:

In a study of Lunar and Mars settlement concepts, an analysis was made of fundamental design assumptions in five technical areas against a model list of occupational and environmental health concerns. The technical areas included the proposed science projects to be supported, habitat and construction issues, closed ecosystem issues, the "MMM" issues--mining, material-processing, and manufacturing--and the human elements of physiology, behavior and mission approach. Four major lessons were learned. First, it is possible to relate public health concerns to complex technological development in a proactive design mode, which has the potential for long-term cost savings. Second, it became very apparent that, before committing any nation or international group to spending the billions needed to start and complete a lunar settlement over the next century, a significantly different approach must be taken from those previously proposed to solve the closed ecosystem and "MMM" problems. Third, it also appears that the health concerns and technology issues to be addressed for human exploration into space are fundamentally those to be solved for human habitation of the earth (as a closed ecosystem) in the 21st century. Finally, it is proposed that ecosystem design modeling must develop new tools, based on probabilistic models as a step up from closed circuit models.

Relevance:

30.00%

Publisher:

Abstract:

A fundamental capacity of the human brain is to learn relations (contingencies) between environmental stimuli and the consequences of their occurrence. Some contingencies are probabilistic; that is, they predict an event in some situations but not in all. Animal studies suggest that damage to limbic structures or the prefrontal cortex may disturb probabilistic learning. The authors studied the learning of probabilistic contingencies in amnesic patients with limbic lesions, patients with prefrontal cortex damage, and healthy controls. Across 120 trials, participants learned contingent relations between spatial sequences and a button press. Amnesic patients had learning comparable to that of control subjects but failed to indicate what they had learned. Across the last 60 trials, amnesic patients and control subjects learned to avoid a noncontingent choice better than frontal patients. These results indicate that probabilistic learning does not depend on the brain structures supporting declarative memory.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND The number of older adults in the global population is increasing. This demographic shift leads to an increasing prevalence of age-associated disorders, such as Alzheimer's disease and other types of dementia. With the progression of the disease, the risk for institutional care increases, which contrasts with the desire of most patients to stay in their home environment. Despite doctors' and caregivers' awareness of the patient's cognitive status, they are often uncertain about its consequences on activities of daily living (ADL). To provide effective care, they need to know how patients cope with ADL, in particular, the estimation of risks associated with the cognitive decline. The occurrence, performance, and duration of different ADL are important indicators of functional ability. The patient's ability to cope with these activities is traditionally assessed with questionnaires, which has disadvantages (e.g., lack of reliability and sensitivity). Several groups have proposed sensor-based systems to recognize and quantify these activities in the patient's home. Combined with Web technology, these systems can inform caregivers about their patients in real time (e.g., via smartphone). OBJECTIVE We hypothesize that a non-intrusive system, which does not use body-mounted sensors, video-based imaging, or microphone recordings, would be better suited for use in dementia patients. Since it does not require the patient's attention and compliance, such a system might be well accepted by patients. We present a passive, Web-based, non-intrusive, assistive technology system that recognizes and classifies ADL. METHODS The components of this novel assistive technology system were wireless sensors distributed in every room of the participant's home and a central computer unit (CCU). The environmental data were acquired for 20 days (per participant) and then stored and processed on the CCU. In consultation with medical experts, eight ADL were selected for classification.
RESULTS In this study, 10 healthy participants (6 women, 4 men; mean age 48.8 years; SD 20.0 years; age range 28-79 years) were included. For explorative purposes, one female Alzheimer patient (Montreal Cognitive Assessment score=23, Timed Up and Go=19.8 seconds, Trail Making Test A=84.3 seconds, Trail Making Test B=146 seconds) was measured in parallel with the healthy subjects. In total, 1317 ADL were performed by the participants, 1211 ADL were classified correctly, and 106 ADL were missed. This led to an overall sensitivity of 91.27% and a specificity of 92.52%. Each subject performed an average of 134.8 ADL (SD 75). CONCLUSIONS The non-intrusive wireless sensor system can acquire environmental data essential for the classification of activities of daily living. By analyzing retrieved data, it is possible to distinguish and assign data patterns to subjects' specific activities and to identify eight different activities in daily living. The Web-based technology allows the system to improve care and provides valuable information about the patient in real-time.
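The reported sensitivity and specificity follow from the standard confusion-matrix definitions. A sketch with hypothetical per-activity counts (the abstract gives only the aggregate 91.27% and 92.52% figures, not the raw confusion data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP),
    the two summary metrics reported for the ADL classifier.  The
    counts used below are made up for illustration."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical confusion counts for one activity class
sens, spec = sensitivity_specificity(tp=91, fn=9, tn=93, fp=7)
print(sens, spec)  # 0.91 0.93
```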