961 results for Memory models


Relevance:

20.00%

Publisher:

Abstract:

This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov chain Monte Carlo (MCMC) sampling techniques, and the related label switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via the Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test-based estimation methods, whereby priors are chosen to encourage extra groups to have weights approaching zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Zmix can accurately estimate the number of components, posterior parameter estimates and allocation probabilities given a sufficiently large sample size. The results reflect uncertainty in the final model and report the range of possible candidate models and their respective estimated probabilities from a single run. Label switching is resolved with a computationally lightweight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies are included to illustrate Zmix and Zswitch, as well as three case studies from the literature. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.
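The overfitting mechanism described here, a sparse prior that drives the weights of superfluous components toward zero, can be illustrated with a small sketch. This is not the Zmix implementation; K, the Dirichlet concentrations, and the 1% occupancy threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of the overfitted-mixture idea: deliberately fit more components
# K than the data support, with a sparse symmetric Dirichlet(alpha) prior
# on the mixture weights. A small alpha pushes the extra components'
# weights toward zero, so the number of "occupied" components estimates
# the true number of groups.
K = 10
eff = {}
for alpha in (1.0, 0.05):
    weights = rng.dirichlet(np.full(K, alpha), size=5000)
    # average number of components carrying more than 1% of the weight
    eff[alpha] = (weights > 0.01).sum(axis=1).mean()
    print(f"alpha={alpha}: ~{eff[alpha]:.1f} of {K} components above 1% weight")
```

Under the diffuse prior nearly all K components stay occupied, while under the sparse prior only a handful do, which is the behaviour the abstract's priors are chosen to induce.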

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. This study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account develops further the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain.
The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.

Relevance:

20.00%

Publisher:

Abstract:

Using the promeasure technique, we give an alternative evaluation of a path integral corresponding to a quadratic action with a generalized memory.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series, such as returns, is skewed. It is therefore important to know what properties a model should possess if it is to accommodate unconditional skewness. We consider modeling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models for the third-moment structure of the marginal distribution, as well as the conditions under which the unconditional distribution exhibits skewness and a nonzero third-order autocovariance structure. In this respect, an asymmetric or nonlinear specification of the conditional mean is found to be of greater importance than the properties of the conditional variance. Several examples are discussed and, whenever possible, explicit analytical expressions are provided for all third-order moments and cross-moments. Finally, we introduce a new tool, the shock impact curve, for investigating the impact of shocks on the conditional mean squared error of return series.
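The link drawn here between an asymmetric conditional-mean response to shocks and unconditional skewness can be illustrated with a toy simulation. The model below is an illustrative sketch, not one of the paper's specifications: symmetric Gaussian shocks enter the mean asymmetrically, and the marginal distribution inherits positive skewness.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_skewness(x):
    """Standardized third central moment of a sample."""
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

# Toy model with an asymmetric conditional mean (illustrative only):
#   y_t = beta * max(eps_{t-1}, 0) + eps_t,   eps_t ~ N(0, 1)
# max(eps, 0) is itself skewed, so y_t is unconditionally skewed even
# though the shock distribution is symmetric.
T, beta = 200_000, 1.0
eps = rng.standard_normal(T + 1)
y = beta * np.maximum(eps[:-1], 0.0) + eps[1:]

print(f"skewness of shocks: {sample_skewness(eps):+.3f}")
print(f"skewness of y:      {sample_skewness(y):+.3f}")
```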

Relevance:

20.00%

Publisher:

Abstract:

Climate matching software (CLIMEX) was used to prioritise areas to explore for biological control agents in the native range of cat's claw creeper Macfadyena unguis-cati (Bignoniaceae), and to prioritise areas in which to release the agents in the plant's introduced ranges. The native distribution of cat's claw creeper was used to predict the potential range of climatically suitable habitats for the plant in its introduced ranges. A Composite Match Index (CMI) was determined with the 'Match Climates' function in order to match the ranges in Australia and South Africa, where the plant is introduced, with its native range in South and Central America. This information was used to determine which areas might yield climatically adapted agents. Locations in northern Argentina had CMI values that best matched sites with cat's claw creeper infestations in Australia and South Africa. None of the sites from which the three currently prioritised biological control agents for cat's claw creeper were collected had CMI values higher than 0.8. The analysis showed that central and eastern Argentina, southern Brazil, Uruguay and parts of Bolivia and Paraguay should be prioritised in the exploration for new biological control agents for use against cat's claw creeper in Australia and South Africa.

Relevance:

20.00%

Publisher:

Abstract:

The problem of identifying the parameters of time-invariant linear dynamical systems with fractional derivative damping models, based on a spatially incomplete set of measured frequency response functions and experimentally determined eigensolutions, is considered. Methods based on inverse sensitivity analysis of damped eigensolutions and frequency response functions are developed. It is shown that the eigensensitivity method requires the derivatives of solutions of an asymmetric generalized eigenvalue problem. Both first- and second-order inverse sensitivity analyses are considered. The study demonstrates the successful performance of the identification algorithms on synthetic data for one-, two- and 33-degree-of-freedom vibrating systems with fractional dampers. Limited studies have also been conducted by combining finite element modeling with experimental data on accelerances measured under laboratory conditions on a system consisting of two steel beams rigidly joined together by a rubber hose. The method based on the sensitivity of frequency response functions is shown to be more efficient than the eigensensitivity-based method in identifying system parameters, especially for large-scale systems.
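A rough illustration of frequency-response-function sensitivity with respect to a fractional damping parameter: the sketch below uses a single-degree-of-freedom oscillator with a fractional Kelvin-Voigt damping term and estimates the derivative of the FRF with respect to the fractional order numerically. All parameter values and the finite-difference scheme are illustrative assumptions, not the paper's inverse-sensitivity algorithm.

```python
import numpy as np

# FRF of a single-degree-of-freedom oscillator with fractional damping:
#   H(i*omega) = 1 / (k + c*(i*omega)**alpha - m*omega**2)
def frf(omega, m, c, k, alpha):
    return 1.0 / (k + c * (1j * omega) ** alpha - m * omega**2)

m, c, k, alpha = 1.0, 0.4, 25.0, 0.5   # illustrative values
omega = np.linspace(0.1, 10.0, 500)

# Central finite difference in the fractional order alpha; an inverse
# sensitivity scheme would assemble such derivatives into a Jacobian and
# solve for parameter updates from measured FRF residuals.
h = 1e-6
dH_dalpha = (frf(omega, m, c, k, alpha + h)
             - frf(omega, m, c, k, alpha - h)) / (2 * h)

peak = omega[np.argmax(np.abs(frf(omega, m, c, k, alpha)))]
print(f"resonance near omega = {peak:.2f} rad/s (undamped: {np.sqrt(k/m):.2f})")
print(f"max |dH/dalpha| = {np.abs(dH_dalpha).max():.4f}")
```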

Relevance:

20.00%

Publisher:

Abstract:

Knowledge of drag force is an important design parameter in aerodynamics. Measurement of aerodynamic forces at hypersonic speed is a challenge, and ground test facilities such as shock tunnels are usually used to carry out such tests. Accelerometer-based force balances are commonly employed for measuring aerodynamic drag around bodies in hypersonic shock tunnels. In this study, we present an analysis of the effect of model material on the performance of an accelerometer balance used for drag measurement in impulse facilities. From the experimental studies performed on models constructed of Bakelite HYLEM and aluminum, it is clear that the rigid-body assumption does not hold during the short testing duration available in shock tunnels. This is notwithstanding the fact that the rubber bush used for supporting the model allows unconstrained motion of the model during the short testing time available in the shock tunnel. The vibrations induced in the model by impact loading in the shock tunnel are damped out in the metallic model, resulting in a smooth acceleration signal, while the signal becomes noisy and nonlinear when non-isotropic materials such as Bakelite HYLEM are used. This also implies that careful analysis and proper data reduction methodologies are necessary when measuring aerodynamic drag on non-metallic models in shock tunnels. Results from drag measurements carried out using a 60 degree half-angle blunt cone are given in the present analysis.

Relevance:

20.00%

Publisher:

Abstract:

Public-Private Partnerships (PPP) are established globally as an important mode of procurement, and the features of PPP, not least of which is the transfer of risk, appeal to governments, particularly in the current economic climate. Many other advantages of PPP are claimed to outweigh its costs and to afford Value for Money (VfM) relative to traditionally financed, non-PPP projects. That said, we lack comparative whole-life empirical studies of VfM in PPP and non-PPP. While we await this kind of study, the pace and trajectory of PPP seem set to continue, and so in the meantime the virtues of seeking to improve PPP appear incontrovertible. The decision about which projects, or parts of projects, to offer to the market as a PPP, and the decision concerning the allocation or sharing of risks as part of the engagement of the PPP consortium, are among the most fundamental decisions that determine whether PPP delivers VfM. The focus in this paper is on the latter decision concerning governments' attitudes towards risk and, more specifically, on the effect of this decision on the nature of the emergent PPP consortium, or PPP model, including its economic behavior and outcomes. This paper explores the extent to which the seemingly incompatible alternatives of risk allocation and risk sharing, represented by the orthodox/conventional PPP model and the heterodox/alliance PPP model respectively, can be reconciled, along with suggestions for new research directions to inform this reconciliation. In so doing, an important step is taken towards charting a path by which governments can harness the relative strengths of both kinds of PPP model.

Relevance:

20.00%

Publisher:

Abstract:

A constitutive modeling approach for shape memory alloy (SMA) wire that takes into account microstructural phase inhomogeneity and the associated solid-solid phase transformation kinetics is reported in this paper. The approach is applicable to general thermomechanical loading. Characterization of the various scales in the non-local rate-sensitive kinetics is the main focus of this paper. Design of SMA materials and actuators not only involves an optimal exploitation of the hysteresis loops during loading-unloading, but also accounts for fatigue and training-cycle identification. For a successful design of SMA-integrated actuator systems, it is essential to include microstructural inhomogeneity effects and the loading-rate dependence of the martensitic evolution, since these factors play a predominant role in fatigue. In the proposed formulation, the evolution of the new phase is assumed to follow a Weibull distribution. Fourier transformation and finite difference methods are applied to arrive at the analytical form of two important scaling parameters. The ratio of these scaling parameters is of the order of 10^6 for stress-free temperature-induced transformation and 10^4 for stress-induced transformation. These scaling parameters are used to study the effect of microstructural variation on the thermomechanical force and the interface driving force. It is observed that the interface driving force is significant during the evolution. An increase in the slopes of the transformation start and end regions in the stress-strain hysteresis loop is observed for mechanical loading at higher rates.
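The Weibull assumption on the evolution of the new phase can be sketched in a few lines. The parameters below are illustrative, not the paper's analytically derived scaling parameters: the cumulative transformed fraction is taken as a Weibull function of a normalized driving variable (temperature or stress past the transformation-start point).

```python
import numpy as np

def martensite_fraction(x, scale=1.0, shape=2.0):
    """Cumulative transformed (martensite) fraction, assumed Weibull in
    the normalized driving variable x (illustrative parameters)."""
    x = np.asarray(x, dtype=float)
    return 1.0 - np.exp(-((x / scale) ** shape))

# Fraction evolves from 0 (no transformation) toward 1 (fully transformed)
x = np.linspace(0.0, 3.0, 7)
for xv, xi in zip(x, martensite_fraction(x)):
    print(f"driving variable {xv:.1f} -> transformed fraction {xi:.3f}")
```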

Relevance:

20.00%

Publisher:

Abstract:

Absenteeism is one of the major problems of Indian industries. It necessitates the employment of more manpower than the jobs require, increasing manpower costs, and it lowers the efficiency of plant operation through lowered performance and higher rejects. It also causes machine idleness if extra manpower is not hired, resulting in disrupted work schedules and assignments. Several studies have investigated the causes of absenteeism and their remedies (e.g., Vaid 1967), as well as the relationship between absenteeism and turnover, with a suggested model for diagnosis and treatment (Hawk 1976). However, production foremen and supervisors face the operating task of determining how many extra operatives to hire in order to stave off the adverse effects of absenteeism on the man-machine system. This paper deals with a class of reserve manpower models based on the reject allowance model familiar from the quality control literature. The present study considers, in addition to absenteeism, machine failures and the graded nature of manpower encountered in production systems, and seeks to find the optimal reserve manpower through computer simulation.
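The operating question posed here, how many extra operatives to hire, can be sketched as a small Monte Carlo simulation. All rates and costs below are illustrative assumptions; the paper's models additionally include machine failures and graded manpower, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy reserve-manpower problem: with each worker absent independently
# with probability p_absent, how many reserves R minimize the expected
# daily cost of reserve wages plus idle machines?
required = 50                 # operatives needed to run all machines
p_absent = 0.10               # daily absenteeism rate (illustrative)
wage, idle_cost = 1.0, 5.0    # cost per reserve vs. per idle machine

def expected_daily_cost(R, n_days=20_000):
    # workers present each day out of a staff of required + R
    present = rng.binomial(required + R, 1 - p_absent, size=n_days)
    shortfall = np.maximum(required - present, 0)   # idle machines
    return wage * R + idle_cost * shortfall.mean()

costs = {R: expected_daily_cost(R) for R in range(0, 16)}
best = min(costs, key=costs.get)
print(f"cheapest reserve level: R = {best} (expected cost {costs[best]:.2f})")
```

With no reserves the expected cost is dominated by idle machines; the simulation trades that against the wage bill of the reserves, mirroring the reject-allowance logic of over-provisioning against expected losses.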

Relevance:

20.00%

Publisher:

Abstract:

Acute encephalitis is an inflammation of the brain, mostly caused by viral infection. A variety of cognitive symptoms may persist after the acute stage, and neuropsychological assessment is crucial in evaluating the outcome. The most commonly reported sequelae are memory deficits. The main aims of this study were to investigate the types of memory impairment in various encephalitides, the frequency of global amnesia following encephalitis, and the changes in the deficits during follow-up. Between 1 January 1985 and 31 December 1994, 77 adult patients under the age of 75 with acute encephalitis, but without alcohol abuse or coexisting or previous neurological diseases, were consecutively referred for neuropsychological examination at the Department of Neurology, Helsinki University Central Hospital. The aetiology was established in 44/77 (57%) patients; 17 had Herpes simplex virus encephalitis (HSVE). Transient amnesia (TENA) at the acute stage of the disease was found in 70% of patients. Furthermore, similarly to brain trauma, TENA was found to indicate cognitive outcome. The frequency of a persisting global amnesia syndrome with both anterograde and retrograde amnesia among all encephalitic patients was 6%. One patient had isolated retrograde amnesia, which is very rare. In HSVE the frequency of global amnesia was 12.5%, which is lower than expected. As a group, HSVE patients were not found to have a homogeneous pattern of amnesia; instead, subgroups among all encephalitic patients were observed: some patients had impaired semantic memory, some had difficulty predominantly with executive functions, and some suffered from an increased forgetting rate. Herpes zoster encephalitis was found to result in only mild memory impairment, and its qualitative features indicated a subcortical dysfunction. On the whole, the cognitive deficits were found predominantly to diminish during follow-up. Progressive deterioration was often associated with intractable epilepsy.
The frequency of dementia was 12.5%. In conclusion, the neuropsychological outcome, especially in HSVE, was more favourable than has previously been reported, possibly due to early acyclovir medication. Memory disorders after encephalitis should not be considered uniform, and the need for neuropsychological rehabilitation should be considered case by case.

Relevance:

20.00%

Publisher:

Abstract:

Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows the storage of large representations for long periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on how efficiently LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking.
In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. Another study demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid, visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized because access is too slow. These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.

Relevance:

20.00%

Publisher:

Abstract:

Neuronal oscillations are thought to underlie interactions between distinct brain regions required for normal memory functioning. This study aimed at elucidating the neuronal basis of memory abnormalities in neurodegenerative disorders. Magnetoencephalography (MEG) was used to measure oscillatory brain signals in patients with Alzheimer's disease (AD), a neurodegenerative disease causing progressive cognitive decline, and mild cognitive impairment (MCI), a disorder characterized by mild but clinically significant complaints of memory loss without apparent impairment in other cognitive domains. Furthermore, to help interpret our AD/MCI results and to develop more powerful oscillatory MEG paradigms for clinical memory studies, the oscillatory neuronal activity underlying declarative memory, the function afflicted first in both AD and MCI, was investigated in a group of healthy subjects. An increased temporal-lobe contribution coinciding with parieto-occipital deficits in oscillatory activity was observed in AD patients: sources in the 6-12.5 Hz range were significantly stronger in the parieto-occipital region and significantly weaker in the right temporal region in AD patients, as compared to MCI patients and healthy elderly subjects. Further, the auditory steady-state response, thought to represent both evoked and induced activity, was enhanced in AD patients as compared to controls, possibly reflecting decreased inhibition in auditory processing and deficits in adaptation to repetitive stimulation of low relevance. Finally, the methodological study revealed that successful declarative encoding and retrieval is associated with increases in occipital gamma and right-hemisphere theta power in healthy unmedicated subjects. This result suggests that investigation of neuronal oscillations during cognitive performance could potentially be used to investigate declarative memory deficits in AD patients.
Taken together, the present results provide insight into the role of brain oscillatory activity in memory function and memory disorders.

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the problem of selecting users in an online social network for targeted advertising so as to maximize the adoption of a given product. In previous work, two families of models have been considered to address this problem: direct targeting and network-based targeting. The former approach targets users with the highest propensity to adopt the product, while the latter targets users with the highest influence potential, that is, users whose adoption is most likely to be followed by subsequent adoptions by peers. This paper proposes a hybrid approach that combines a notion of propensity and a notion of influence into a single utility function. We show that targeting a fixed number of high-utility users results in more adoptions than targeting either highly influential users or users with high propensity.
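The hybrid idea can be sketched as follows. The linear combination, the weight, and the random scores are illustrative assumptions; the paper defines its own utility function and estimates propensity and influence from data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hybrid targeting sketch: score each user by a single utility combining
# adoption propensity with influence potential, then target the top-k.
n_users, k, weight = 1000, 10, 0.5
propensity = rng.random(n_users)   # stand-in for P(adopt | targeted)
influence = rng.random(n_users)    # stand-in for expected peer adoptions

utility = weight * propensity + (1 - weight) * influence
targets = np.argsort(utility)[-k:][::-1]   # k highest-utility users

print("targeted users:", targets.tolist())
print("mean utility of targets:", round(float(utility[targets].mean()), 3))
```

A pure direct-targeting baseline would sort on `propensity` alone and a pure network-based baseline on `influence` alone; the hybrid utility interpolates between them.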

Relevance:

20.00%

Publisher:

Abstract:

Modeling of cultivar × trial effects for multienvironment trials (METs) within a mixed model framework is now common practice in many plant breeding programs. The factor analytic (FA) model is a parsimonious form used to approximate the fully unstructured form of the genetic variance-covariance matrix in the model for MET data. In this study, we demonstrate that the FA model is generally the model of best fit across a range of data sets taken from early-generation trials in a breeding program. In addition, we demonstrate the superiority of the FA model in achieving the most common aim of METs, namely the selection of superior genotypes. Selection is achieved using best linear unbiased predictions (BLUPs) of cultivar effects at each environment, considered either individually or as a weighted average across environments. In practice, empirical BLUPs (E-BLUPs) of cultivar effects must be used instead of BLUPs, since variance parameters in the model must be estimated rather than assumed known. While the optimal properties of minimum mean squared error of prediction (MSEP) and maximum correlation between true and predicted effects possessed by BLUPs do not hold for E-BLUPs, a simulation study shows that E-BLUPs perform well in terms of MSEP.
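The parsimony of the FA form can be sketched directly. Dimensions and values below are illustrative, not from the trial data in the paper: a factor analytic model approximates the t × t genetic variance-covariance matrix across environments as loadings times loadings transposed plus a diagonal of specific variances.

```python
import numpy as np

rng = np.random.default_rng(3)

# FA parameterization of the genetic variance-covariance matrix:
#   G ~= Lambda @ Lambda.T + Psi
# Lambda: t x f matrix of environment loadings, Psi: diagonal specific
# variances. Illustrative FA1 example with t = 8 environments.
t, f = 8, 1
Lam = rng.normal(0.8, 0.2, size=(t, f))        # environment loadings
Psi = np.diag(rng.uniform(0.1, 0.3, size=t))   # specific variances

G = Lam @ Lam.T + Psi

unstructured_params = t * (t + 1) // 2   # fully unstructured covariance
fa_params = t * f + t                    # loadings + specific variances
print(f"unstructured: {unstructured_params} parameters; FA{f}: {fa_params}")
print("G symmetric positive definite:", bool(np.all(np.linalg.eigvalsh(G) > 0)))
```

The parameter count grows linearly in t for a fixed number of factors, rather than quadratically, which is why the FA form scales to METs with many environments.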