933 results for non-trivial data structures


Relevance:

100.00%

Publisher:

Abstract:

Domain-specific languages (DSLs) are increasingly used as embedded languages within general-purpose host languages. DSLs provide a compact, dedicated syntax for specifying parts of an application related to specialized domains. Unfortunately, such language extensions typically do not integrate well with the development tools of the host language: editors, compilers and debuggers are either unaware of the extensions or must be adapted at non-trivial cost. We present a novel approach to embedding DSLs into an existing host language by leveraging the underlying representation of the host language used by these tools. Helvetia is an extensible system that intercepts the compilation pipeline of the Smalltalk host language to seamlessly integrate language extensions. We validate our approach with case studies that demonstrate three fundamentally different ways to extend or adapt the syntax and semantics of the host language.


Stimulation of human epileptic tissue can induce rhythmic, self-terminating responses on the EEG or ECoG. These responses play a potentially important role in localising tissue involved in the generation of seizure activity, yet the underlying mechanisms are unknown. However, in vitro evidence suggests that self-terminating oscillations in nervous tissue are underpinned by non-trivial spatio-temporal dynamics in an excitable medium. In this study, we investigate this hypothesis in spatial extensions to a neural mass model for epileptiform dynamics. We demonstrate that spatial extensions to this model in one and two dimensions display propagating travelling waves but also more complex transient dynamics in response to local perturbations. The neural mass formulation, with local excitatory and inhibitory circuits, allows the direct incorporation of spatially distributed, functional heterogeneities into the model. We show that such heterogeneities can lead to prolonged reverberating responses to a single pulse perturbation, depending upon the location at which the stimulus is delivered. This leads to the hypothesis that prolonged rhythmic responses to local stimulation in epileptogenic tissue result from repeated self-excitation of regions of tissue with diminished inhibitory capabilities. Combined with previous models of the dynamics of focal seizures, this macroscopic framework is a first step towards an explicit spatial formulation of the concept of the epileptogenic zone. Ultimately, an improved understanding of the pathophysiologic mechanisms of the epileptogenic zone will help to improve diagnostic and therapeutic measures for treating epilepsy.


'Weak senses' are a specific type of semantic information, as opposed to assertions and presuppositions. The universal trait of weak senses is that they assume 'if' modality in negative contexts. In addition, they exhibit several other diagnostic properties: e.g. they fill at least one of their valency places with a semantic element sensitive to negation (i.e. with an assertion or another weak sense), they normally do not fall within the scope of functors, do not play any role in causal relations, and resist intensification. As weak senses are widespread in lexical, grammatical and referential semantics, this notion holds the clue to phenomena as diverse as the oppositions little - a little, few - a few, edva ('hardly') - cut' ('slightly'), where a little, a few, and cut' convey 'weakly' approximately what little, few, and edva do in an assertive way; the semantics of the Russian perfect aspect; and the formation rules for conjunction strings. Zeldovich outlines a typology of weak senses, the main distinction being between weak senses unilaterally dependent upon the truthfulness of what they saturate their valency with, and weak senses exerting their own influence on the main situation. The latter, called non-trivial, are instantiated by existential quantifiers involved in the semantics of indefinite pronouns, iterative verbs, etc.


A cascading failure is a failure in a system of interconnected parts in which the breakdown of one element can lead to the subsequent collapse of the others. The aim of this paper is to introduce a simple combinatorial model for the study of cascading failures. In particular, having in mind particle systems and Markov random fields, we consider a network of interacting urns displaced over a lattice. Every urn is Pólya-like, and its reinforcement matrix is a function not only of time (time contagion) but also of the behavior of the neighboring urns (spatial contagion) and of a random component, which can represent either simple fate or the impact of exogenous factors. In this way a non-trivial dependence structure among the urns is built, and it is used to study default avalanches over the lattice. Thanks to its flexibility and its interesting probabilistic properties, the given construction may be used to model different phenomena characterized by cascading failures, such as power grids and financial networks.
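The interacting-urn construction can be illustrated with a toy simulation. The sketch below uses a simple, hypothetical reinforcement rule on a periodic lattice: each urn is sampled, then reinforced according to its own last draw (time contagion), its neighbours' last draws (spatial contagion), and a noise term. This is an illustrative rule, not the paper's actual reinforcement matrix.

```python
import random

def simulate_urn_lattice(n=10, steps=50, spatial_weight=0.5, noise=0.1, seed=1):
    """Toy lattice of Polya-like urns with time and spatial contagion.

    Each site holds (black, white) counts; a black draw counts as a
    'failure'. Returns the fraction of sites whose final draw was a
    failure -- a crude measure of the avalanche size.
    """
    rng = random.Random(seed)
    black = [[1.0] * n for _ in range(n)]
    white = [[1.0] * n for _ in range(n)]
    last = [[0] * n for _ in range(n)]  # last draw at each site: 1 = failure
    for _ in range(steps):
        draws = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                p = black[i][j] / (black[i][j] + white[i][j])
                draws[i][j] = 1 if rng.random() < p else 0
        for i in range(n):
            for j in range(n):
                nbrs = [last[(i + di) % n][(j + dj) % n]
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                reinforce = (draws[i][j]                       # time contagion
                             + spatial_weight * sum(nbrs) / 4  # spatial contagion
                             + noise * rng.random())           # random component
                if draws[i][j]:
                    black[i][j] += reinforce
                else:
                    white[i][j] += reinforce
        last = draws
    return sum(map(sum, last)) / (n * n)
```

Raising `spatial_weight` strengthens the dependence among neighbouring urns, making large synchronized runs of failures (avalanches) more likely.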


BACKGROUND: Delayed uterine involution has negative effects on the fertility of cows; use of prostaglandin F2alpha alone as a single treatment has not been shown to consistently improve fertility. Combined administration of PGF2alpha and PGE2 increased uterine pressure in healthy cows. We hypothesized that the combination of both prostaglandins would accelerate uterine involution and therefore have a positive effect on fertility variables. The aim was to demonstrate the benefit of a single post partum combined prostaglandin treatment in commercial dairy farming. METHODS: 383 cows from commercial dairy farms were included in this study. Uterine size and secretion were evaluated at treatment, 21-35 days post partum, and 14 days later. Cows were randomly allocated to one of three treatment groups: PGF2alpha and PGE2, PGF2alpha, or placebo. For every animal participating in the study, the following reproduction variables were recorded: interval from calving to first insemination, days open, number of artificial inseminations (AI) to conception, subsequent treatment of the uterus, and subsequent treatment of the ovaries. Plasma progesterone level at the time of treatment was used as a covariable. For continuous measurements, analysis of variance was performed. Fisher's exact test for categorical non-ordered data and the exact Kruskal-Wallis test for ordered data were used; pairwise group comparisons with Bonferroni adjustment of the significance level were performed. RESULTS: There was no significant difference among treatment groups in uterine size. Furthermore, there was no significant difference among treatments concerning days open, number of AI, and subsequent treatment of the uterus and ovaries. Days from calving to first insemination tended to be shorter for cows with a low progesterone level given PGF2alpha and PGE2 in combination than for the placebo group (P = 0.024).
CONCLUSION: The results of this study indicate that the administration of PGF2alpha or a combination of PGF2alpha and PGE2 21 to 35 days post partum had no beneficial effect upon measured fertility variables. The exception was a tendency for a shorter interval from calving to first insemination after administration of the combination of PGF2alpha and PGE2, as compared to the placebo group. Further research should be done in herds with reduced fertility and/or an increased incidence of postpartum vaginal discharge.


In epidemiological work, outcomes are frequently non-normal, sample sizes may be large, and effects are often small. To relate health outcomes to geographic risk factors, fast and powerful methods for fitting spatial models, particularly for non-normal data, are required. We focus on binary outcomes, with the risk surface a smooth function of space. We compare penalized likelihood models, including the penalized quasi-likelihood (PQL) approach, and Bayesian models based on fit, speed, and ease of implementation. A Bayesian model using a spectral basis representation of the spatial surface provides the best tradeoff of sensitivity and specificity in simulations, detecting real spatial features while limiting overfitting and being more efficient computationally than other Bayesian approaches. One of the contributions of this work is further development of this underused representation. The spectral basis model outperforms the penalized likelihood methods, which are prone to overfitting, but is slower to fit and not as easily implemented. Conclusions based on a real dataset of cancer cases in Taiwan are similar albeit less conclusive with respect to comparing the approaches. The success of the spectral basis with binary data and similar results with count data suggest that it may be generally useful in spatial models and more complicated hierarchical models.
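The spectral basis idea can be sketched concretely: represent the smooth spatial risk surface as a truncated 2-D Fourier expansion and pass it through a logistic link for binary outcomes. The construction below is a minimal illustration assuming a simple tensor set of sine/cosine frequencies; the paper's exact basis, priors and fitting procedure may differ.

```python
import math

def spectral_basis(x, y, K=3):
    """2-D Fourier (spectral) basis evaluated at a location in [0,1]^2.

    A smooth surface f(x, y) is a linear combination of these
    low-frequency terms; truncating at a small K acts as the smoothing
    device, limiting overfitting while capturing broad spatial features.
    """
    feats = [1.0]  # intercept
    for kx in range(1, K + 1):
        for ky in range(1, K + 1):
            feats.append(math.cos(2 * math.pi * (kx * x + ky * y)))
            feats.append(math.sin(2 * math.pi * (kx * x + ky * y)))
    return feats

def risk(x, y, coefs):
    """Binary-outcome risk at a location: logistic of the spectral surface."""
    eta = sum(c * b for c, b in zip(coefs, spectral_basis(x, y)))
    return 1.0 / (1.0 + math.exp(-eta))
```

Because the basis is fixed and low-dimensional (here 1 + 2K² = 19 coefficients), fitting reduces to an ordinary logistic regression in the coefficients, which is part of why the spectral representation is computationally efficient relative to other Bayesian spatial models.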


This thesis presents a paleoclimatic/paleoenvironmental study conducted on clastic cave sediments of the Moravian Karst, Czech Republic. The study is based on environmental magnetic techniques, yet a wide range of other scientific methods was used to obtain a clearer picture of the Quaternary climate. My thesis also presents an overview of the significance of cave deposits for paleoclimatic reconstructions, explains basic environmental magnetic techniques and offers background information on the study area – a famous karst region in Central Europe with a rich history. In Kulna Cave, magnetic susceptibility variations, and in particular variations in pedogenic susceptibility, yield a detailed record of the palaeoenvironmental conditions during the Last Glacial Stage. The Kulna long-term climatic trends agree with the deep-sea SPECMAP record, while the short-term oscillations correlate with rapid changes in North Atlantic sea surface temperatures. Kulna Cave sediments reflect the intensity of pedogenesis controlled by short-term warmer events and precipitation over the mid-continent, and provide a link between continental European climate and sea surface temperatures in the North Atlantic during the Last Glacial Stage. Given the number of independent climate proxies determined from the entrance facies of the cave and their high resolution, Kulna is an extremely important site for studying Late Pleistocene climate. In the interior of Spiralka Cave, a five-meter-high section of fine-grained sediments deposited during floods yields information on the climatic and environmental conditions of the last millennium. In the upper 1.5 meters of this profile, mineral magnetic and other non-magnetic data indicate that susceptibility variations are controlled by the concentration of magnetite and its magnetic grain size. Comparison of our susceptibility record to the instrumental record of winter temperature anomalies shows a remarkable correlation. This correlation is explained by the coupling of flooding events, cultivation of land and pedogenetic processes in the cave catchment area. A combination of mineral magnetic and geochemical proxies yields a detailed picture of the rapidly evolving climate of the near past and tracks both natural and human-induced environmental changes taking place in the broader region.


Interest in the study of magnetic/non-magnetic multilayered structures took a giant leap after Grünberg and his group established that the interlayer exchange coupling (IEC) is a function of the non-magnetic spacer width. This interest was further fuelled by the discovery of the phenomenal Giant Magnetoresistance (GMR) effect. In fact, in 2007 Albert Fert and Peter Grünberg were awarded the Nobel Prize in Physics for their contribution to the discovery of GMR. GMR is the key property used in the read head of the present-day computer hard drive, as it provides the high sensitivity required for the detection of magnetic fields. The recent increase in demand for device miniaturization has encouraged researchers to look for GMR in nanoscale multilayered structures. In this context, the one-dimensional (1-D) multilayered nanowire structure has shown tremendous promise as a viable candidate for ultra-sensitive read-head sensors. In fact, the GMR effect, the novel feature of the currently used multilayered thin films, has already been observed in multilayered nanowire systems at ambient temperature. Geometrical confinement of the superlattice along two dimensions (2-D) to construct the 1-D multilayered nanowire prohibits the minimization of magnetic interaction, offering a rich variety of magnetic properties in the nanowire that can be exploited for novel functionality. In addition, the introduction of a non-magnetic spacer between the magnetic layers presents an additional advantage in controlling magnetic properties via tuning of the interlayer magnetic interaction. Despite the large volume of theoretical work devoted to the understanding of GMR and IEC in superlattice structures, only limited theoretical calculations have been reported for 1-D multilayered systems. 
Thus, to gauge their potential application in new-generation magneto-electronic devices, in this thesis I discuss the use of first-principles density functional theory (DFT) in predicting the equilibrium structure, stability, and electronic and magnetic properties of one-dimensional multilayered nanowires. In particular, I focus on the electronic and magnetic properties of Fe/Pt multilayered nanowire structures and the role of the non-magnetic Pt spacer in modulating the magnetic properties of the wire. It is found that the average magnetic moment per atom in the nanowire increases monotonically with an approximately 1/N(Fe) dependence, where N(Fe) is the number of iron layers in the nanowire. A simple model based upon the interfacial structure is given to explain the 1/N(Fe) trend in the magnetic moment obtained from the first-principles calculations. A new mechanism, based upon spin flip within the layer and multistep electron transfer between the layers, is proposed to elucidate the enhancement of the magnetic moment of the iron atom at the platinum interface. The calculated IEC in the Fe/Pt multilayered nanowire is found to switch sign as the width of the non-magnetic spacer varies. The competition among short- and long-range direct exchange and super exchange is found to play a key role in the non-monotonic sign of the IEC as a function of the width of the platinum spacer layer. The calculated magnetoresistance from Julliere's model also exhibits switching behavior similar to that of the IEC. The universality of this exchange-coupling behavior has also been examined by introducing different non-magnetic spacers, such as palladium, copper, silver, and gold, between the magnetic iron layers. The nature of the hybridization between Fe and the non-magnetic spacer is found to dictate the interlayer magnetic interaction. 
For example, in the Fe/Pd nanowire the d-p hybridization in the two-spacer-layer case favors the antiferromagnetic (AFM) configuration over the ferromagnetic (FM) configuration. However, the hybridization between the half-filled Fe(d) and filled Cu(p) states in the Fe/Cu nanowire favors FM coupling in the 2-spacer system.


Questionnaire data may contain missing values because certain questions do not apply to all respondents. For instance, questions addressing particular attributes of a symptom, such as frequency, triggers or seasonality, are only applicable to those who have experienced the symptom, while for those who have not, responses to these items will be missing. This missing information does not fall into the category 'missing by design'; rather, the features of interest do not exist and cannot be measured regardless of survey design. Analysis of responses to such conditional items is therefore typically restricted to the subpopulation in which they apply. This article is concerned with joint multivariate modelling of responses to both unconditional and conditional items without restricting the analysis to this subpopulation. Such an approach is of interest when the distributions of both types of responses are thought to be determined by common parameters affecting the whole population. By integrating the conditional item structure into the model, inference can be based both on unconditional data from the entire population and on conditional data from subjects for whom they exist. This approach opens new possibilities for multivariate analysis of such data. We apply this approach to latent class modelling and provide an example using data on respiratory symptoms (wheeze and cough) in children. Conditional data structures such as that considered here are common in medical research settings and, although our focus is on latent class models, the approach can be applied to other multivariate models.


Writing unit tests for legacy systems is a key maintenance task. When writing tests for object-oriented programs, objects need to be set up and the expected effects of executing the unit under test need to be verified. If developers lack internal knowledge of a system, the task of writing tests is non-trivial. To address this problem, we propose an approach that exposes side effects detected in example runs of the system and uses these side effects to guide the developer when writing tests. We introduce a visualization called Test Blueprint, through which we identify what the required fixture is and what assertions are needed to verify the correct behavior of a unit under test. The dynamic analysis technique that underlies our approach is based on both tracing method executions and tracking the flow of objects at runtime. To demonstrate the usefulness of our approach, we present results from two case studies.
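The side-effect-guided idea can be mimicked in a few lines: record the attribute writes observed while an example run exercises an object, then turn each recorded write into a candidate assertion for a unit test. The `SideEffectRecorder` class and its API below are an illustrative stand-in, not the actual tool described in the abstract.

```python
class SideEffectRecorder:
    """Record attribute writes on a wrapped object during an example run.

    Each recorded write suggests one assertion for a test of the code
    that was executed against the object.
    """

    def __init__(self, target):
        object.__setattr__(self, '_target', target)
        object.__setattr__(self, '_writes', [])

    def __getattr__(self, name):
        # reads fall through to the wrapped object
        return getattr(object.__getattribute__(self, '_target'), name)

    def __setattr__(self, name, value):
        # writes are logged, then applied to the wrapped object
        self._writes.append((name, value))
        setattr(self._target, name, value)

    def suggested_assertions(self):
        return [f"assert obj.{n} == {v!r}" for n, v in self._writes]


# demo: exercise the wrapped object, then read off suggested assertions
class Account:
    def __init__(self):
        self.balance = 0

acct = SideEffectRecorder(Account())
acct.balance += 50
print(acct.suggested_assertions())  # prints ['assert obj.balance == 50']
```

The real technique additionally traces method executions and object flow, so the generated blueprint also covers the fixture needed to reach the observed state, not just the final assertions.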


This article presents a study of the staging and implementation of death and the death penalty in a number of popular MMOGs and relates it to players' general experience of gameworlds. Game mechanics, writings and stories by designers and players, and the results of an online survey are analysed and discussed. The study shows that the death penalty is implemented in much the same way across worlds; that death can be both trivial and non-trivial, part of the grind of everyday life, or essential in the creation of heroes, depending on context. Whatever function death may serve, it is argued that death plays an important part in the shaping and emergence of the social culture of a world, and in the individual player's experience of life within it.


Having to carry input devices can be inconvenient when interacting with wall-sized, high-resolution tiled displays. Such displays are typically driven by a cluster of computers. Running existing games on a cluster is non-trivial, and the performance attained using software solutions like Chromium is not good enough. This paper presents a touch-free, multi-user, human-computer interface for wall-sized displays that enables completely device-free interaction. The interface is built using 16 cameras and a cluster of computers, and is integrated with the games Quake 3 Arena (Q3A) and Homeworld. The two games were parallelized using two different approaches in order to run with good performance on a 7x4-tile, 21-megapixel display wall. The touch-free interface enables interaction with a latency of 116 ms, of which 81 ms are due to the camera hardware. The rendering performance of the games is compared to that of their sequential counterparts running on the display wall using Chromium. Parallel Q3A's framerate is an order of magnitude higher than when using Chromium. The parallel version of Homeworld performed on par with the sequential one, which did not run at all using Chromium. Informal use of the touch-free interface indicates that it works better for controlling Q3A than Homeworld.


We present in this paper several contributions to collision detection optimization centered on hardware performance. We focus on the broad phase, the first step of the collision detection process, and propose three new ways to parallelize the well-known Sweep and Prune algorithm. We first developed a multi-core model that takes into account the number of available cores. The multi-core architecture enables us to distribute geometric computations using multi-threading. Critical sections for writing and thread idling have been minimized by introducing new data structures for each thread. Programming with directives, like OpenMP, appears to be a good compromise for code portability. We then propose a new GPU-based algorithm, also based on Sweep and Prune, that has been adapted to multi-GPU architectures. Our technique is based on a spatial subdivision method used to distribute computations among GPUs. Results show that a significant speed-up can be obtained by passing from 1 to 4 GPUs in a large-scale environment.
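The broad phase being parallelized can be sketched in its sequential form: sort the interval endpoints of the bounding boxes along one axis, then sweep, maintaining a set of currently open intervals. A minimal single-axis sketch follows; the multi-core variant described above additionally partitions this work across threads with per-thread result lists to avoid write contention.

```python
def sweep_and_prune(boxes):
    """Broad-phase pass of Sweep and Prune along one axis.

    boxes: list of (min_x, max_x) intervals (one axis of each AABB).
    Returns candidate pairs whose intervals overlap on this axis;
    only these candidates proceed to the narrow phase. Sorting the
    endpoints keeps the pass near O(n log n).
    """
    # events: (coordinate, is_end, box_id); begins sort before ends on ties
    events = []
    for i, (lo, hi) in enumerate(boxes):
        events.append((lo, 0, i))
        events.append((hi, 1, i))
    events.sort()

    active, pairs = set(), []
    for _, is_end, i in events:
        if is_end:
            active.discard(i)                      # interval closes
        else:
            pairs.extend((j, i) for j in active)   # overlaps every open interval
            active.add(i)
    return pairs
```

In a full 3-D broad phase the same sweep is applied per axis (or the candidate pairs are filtered against the remaining axes); the exact data layout used in the paper's multi-core and multi-GPU versions is not reproduced here.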


We investigate the 2-d O(3) model with a θ-term as a toy model for slowly walking 4-d non-Abelian gauge theories. Using the very efficient meron-cluster algorithm, an accurate investigation of the scale dependence of the renormalized coupling is carried out for different values of the vacuum angle θ. Approaching θ = π, the infrared dynamics of the 2-d O(3) model is determined by a non-trivial conformal fixed point. We provide evidence for a slowly walking behavior near the fixed point and we perform a finite-size scaling analysis of the mass gap.


Genetic instability in mammalian cells can occur by many different mechanisms. In the absence of exogenous sources of DNA damage, the DNA structure itself has been implicated in genetic instability. When the canonical B-DNA helix is naturally altered to form a non-canonical DNA structure such as Z-DNA or H-DNA, this can lead to genetic instability in the form of DNA double-strand breaks (DSBs) (1, 2). Our laboratory found that the stability of these non-B DNA structures differed between mammals and Escherichia coli (E. coli) bacteria (1, 2). One explanation for the difference between these species may lie in how DSBs are repaired within each species. Non-homologous end-joining (NHEJ) is primed to repair DSBs in mammalian cells, while bacteria that lack NHEJ (such as E. coli) utilize homologous recombination (HR) to repair DSBs. To investigate the role of the error-prone NHEJ repair pathway in DNA structure-induced genetic instability, E. coli cells were modified to express genes that allow for a functional NHEJ system under different HR backgrounds. The NHEJ-sufficient system of Mycobacterium tuberculosis is composed of Ku and Ligase D (LigD) (3). These inducible NHEJ components were expressed individually and together in E. coli cells, with or without functional HR (RecA/RecB), and the Z-DNA- and H-DNA-induced mutations were characterized. The Z-DNA structure gave rise to higher mutation frequencies compared to the controls, regardless of the DSB repair pathway(s) available; however, the type of mutants produced after repair was largely dictated by the available DSB repair system, as indicated by the shift from 2% large-scale deletions in the total mutant population to 24% large-scale deletions when NHEJ was present (4). This suggests that NHEJ has a role in the large deletions induced by Z-DNA-forming sequences. 
The H-DNA structure, however, did not exhibit an increase in mutagenesis in the newly engineered E. coli environment, suggesting the involvement of other factors in regulating H-DNA formation/stability in bacterial cells. Accurate repair by established DNA DSB repair pathways is essential to maintain the stability of eukaryotic and prokaryotic genomes, and our results suggest that an error-prone NHEJ pathway is involved in non-B DNA structure-induced mutagenesis in both prokaryotes and eukaryotes.