929 results for Model-based
Abstract:
With the aim of comparing the cost of rheumatoid arthritis therapy with disease-modifying antirheumatic drugs (DMARDs) over a 48-month period, five different treatment stages, based on the clinical protocols recommended by the Brazilian Society of Rheumatology, were studied over five therapy cycles. The analytical model, based on Markov analysis, considered the chances of a patient remaining in a given stage or moving between stages according to a positive effect on outcomes. Only direct costs, such as the drugs, materials and tests used for monitoring these patients, were included in the analysed data. The results of the model show that the stage in which methotrexate is used as monotherapy was the most cost-effective (R$ 113,900.00 per patient over 48 months), followed by refractory patients (R$ 1,554,483.43), those using triple therapy followed by infliximab (R$ 1,701,286.76), methotrexate-intolerant patients (R$ 2,629,919.14) and, finally, those starting on methotrexate and infliximab together (R$ 9,292,879.31). The sensitivity analysis, alternating the efficacy of methotrexate and infliximab, confirmed these results.
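A Markov cohort model of the kind described above can be sketched as a simple cycle-by-cycle simulation. The states, transition probabilities and per-cycle costs below are illustrative placeholders, not the values from the study:

```python
# Hypothetical 3-state Markov cohort model for one DMARD treatment stage.
# All probabilities and costs are illustrative assumptions.
STATES = ["respond", "continue", "withdraw"]

# P[i][j]: probability of moving from state i to state j in one monthly cycle.
P = [
    [0.85, 0.10, 0.05],   # responders mostly remain responders
    [0.20, 0.70, 0.10],   # non-responders may respond or withdraw
    [0.00, 0.00, 1.00],   # withdrawal is absorbing
]
COST = [300.0, 450.0, 0.0]   # illustrative direct cost per cycle per state (R$)

def expected_cost(cycles=48, start=(0.0, 1.0, 0.0)):
    """Accumulate the expected direct cost per patient over the horizon."""
    dist = list(start)
    total = 0.0
    for _ in range(cycles):
        total += sum(d * c for d, c in zip(dist, COST))
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return total

print(round(expected_cost(), 2))
```

Comparing `expected_cost` across differently parameterised stages is what allows the cost ranking reported in the abstract.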
Abstract:
This study tested the hypothesis that the quality of leader-member exchange (LMX) depends on the congruity of values between leader and member. Data on negotiating latitude and personal values were gathered from 160 members of 30 work groups in Australian organizations. Factor analysis revealed 5 value dimensions: Freedom, Achievement, Mateship, Obedience, and Coping. Analyses of variance supported the hypothesis that LMX quality is higher when leaders and members share achievement and obedience values. Subsequent exploratory analysis, however, indicated that a more complex model based on the compatibility of leader authority and member affiliation values may provide a more complete representation.
Abstract:
A feedback model based on direct photodetection and micromaser-like atomic injection is proposed for the preservation of quantum coherence in a cavity. We show that in this way it is possible to significantly slow down the decoherence of Schrödinger cat states.
Abstract:
A previous mathematical model explaining dengue in Singapore predicted a reasonable outbreak of about 6500 cases for 2006 and a very mild outbreak with about 2000 cases for 2007. However, only 3051 cases were reported in 2006 while more than 7800 were reported in the first 44 weeks of 2007. We hypothesized that the combination of haze with other local sources of particulate matter had a significant impact on mosquito life expectancy, significantly increasing their mortality rate. To test the hypothesis, a mathematical model based on the reproduction number of dengue fever and aimed at comparing the impact of several possible alternative control strategies was proposed. This model also aimed at contributing to the understanding of the causes of dengue resurgence in Singapore in the last decade. The model's simulation demonstrated that an increase in mosquito mortality in 2006 and either a reduction in mortality or an increase in the carrying capacity of mosquitoes in 2007 explained the pattern observed in Singapore. Based on the model's simulation, we concluded that the fewer-than-expected number of dengue cases in Singapore in 2006 was caused by an increase in mosquito mortality due to the disproportionate haze affecting the country that year, and that particularly favourable environmental conditions in 2007 propitiated mosquitoes with a lower mortality rate, which explains the greater-than-expected number of dengue cases in 2007. Whether our hypothesis is plausible or not should be debated further.
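The dependence of dengue transmission on mosquito mortality can be illustrated with a Ross-Macdonald-style basic reproduction number. All parameter values below are illustrative assumptions, not the values fitted to the Singapore data:

```python
import math

# Ross-Macdonald-style reproduction number for a vector-borne disease.
# Parameter values are illustrative placeholders only.
def r0(mu, m=2.0, a=0.5, b=0.4, c=0.4, tau=10.0, gamma=0.2):
    """
    mu    : mosquito daily mortality rate
    m     : mosquitoes per human (carrying-capacity proxy)
    a     : biting rate; b, c: transmission probabilities per bite
    tau   : extrinsic incubation period (days)
    gamma : human recovery rate
    """
    return (m * a**2 * b * c * math.exp(-mu * tau)) / (mu * gamma)

# Higher mosquito mortality (e.g. haze in 2006) sharply lowers R0,
# while a larger carrying capacity m raises it (as hypothesised for 2007).
print(r0(0.10), r0(0.25))
```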
Abstract:
Weiss and Isen have provided many supportive comments about the multi-level perspective, but also found limitations. Isen noted the importance of integrating affect, cognition, and motivation. Weiss commented similarly that the model lacked an integrating “thread.” He suggested that, to be truly multilevel, each level should constrain processes at other levels, and also provide guidance for the development of new concepts. Weiss also noted that the focus on biological processes was a strength of the model. I respond by suggesting that these very biological processes may constitute the “missing” thread. To illustrate this, I discuss some of the recent research on emotions in organizational settings, and argue that biology both constrains and guides theory at each level of the model. Based on this proposition, I revisit each of the five levels in the model, to demonstrate how this integration can be accomplished in this fashion. Finally, I address two additional points: aggregation bias, and the possibility of extending the model to include higher levels of industry and region.
Abstract:
The superior cervical ganglion (SCG) in mammals varies in structure according to developmental age, body size, gender, lateral asymmetry, the size and nuclear content of neurons and the complexity and synaptic coverage of their dendritic trees. In small and medium-sized mammals, neuron number and size increase from birth to adulthood and, in phylogenetic studies, vary with body size. However, recent studies on larger animals suggest that body weight does not, in general, accurately predict neuron number. We have applied design-based stereological tools at the light-microscopic level to assess the volumetric composition of ganglia and to estimate the numbers and sizes of neurons in SCGs from rats, capybaras and horses. Using transmission electron microscopy, we have obtained design-based estimates of the surface coverage of dendrites by postsynaptic apposition zones and model-based estimates of the numbers and sizes of synaptophysin-labelled axo-dendritic synaptic disks. Linear regression analysis of log-transformed data has been undertaken in order to establish the nature of the relationships between numbers and SCG volume (V(scg)). For SCGs (five per species), the allometric relationship for neuron number (N) is N = 35,067 × V(scg)^0.781 and that for synapses is N = 20,095,000 × V(scg)^1.328, the former being a good predictor and the latter a poor predictor of synapse number. Our findings thus reveal the nature of SCG growth in terms of its main ingredients (neurons, neuropil, blood vessels) and show that larger mammals have SCG neurons exhibiting more complex arborizations and greater numbers of axo-dendritic synapses.
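The log-transformed regression behind such allometric relationships can be sketched as follows. The ganglion volumes are made up; the neuron counts are generated from the reported relationship N = 35,067 × V(scg)^0.781, so the fit should recover that exponent:

```python
import math

# Log-log regression sketch recovering an allometric exponent from N = k * V**alpha.
# Volumes are hypothetical; counts follow the relationship quoted in the abstract.
volumes = [0.5, 1.0, 2.0, 4.0, 8.0]
counts  = [35067 * v**0.781 for v in volumes]

def loglog_fit(xs, ys):
    """Ordinary least squares on log-transformed data; returns (alpha, k)."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
             / sum((a - mx) ** 2 for a in lx))
    intercept = my - slope * mx
    return slope, math.exp(intercept)

alpha, k = loglog_fit(volumes, counts)
print(round(alpha, 3), round(k))
```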
Abstract:
High-level microsatellite instability (MSI-H) is demonstrated in 10 to 15% of sporadic colorectal cancers and in most cancers presenting in the inherited condition hereditary nonpolyposis colorectal cancer (HNPCC). Distinction between these categories of MSI-H cancer is of clinical importance and the aim of this study was to assess clinical, pathological, and molecular features that might be discriminatory. One hundred and twelve MSI-H colorectal cancers from families fulfilling the Bethesda criteria were compared with 57 sporadic MSI-H colorectal cancers. HNPCC cancers presented at a lower age (P < 0.001), with no sporadic MSI-H cancer being diagnosed before the age of 57 years. MSI was less extensive in HNPCC cancers, with 72% of microsatellite markers showing band shifts compared with 87% in sporadic tumors (P < 0.001). Absent immunostaining for hMSH2 was only found in HNPCC tumors. Methylation of hMLH1 was observed in 87% of sporadic cancers but also in 55% of HNPCC tumors that showed loss of expression of hMLH1 (P = 0.02). HNPCC cancers were more frequently characterized by aberrant beta-catenin immunostaining as evidenced by nuclear positivity (P < 0.001). Aberrant p53 immunostaining was infrequent in both groups. There were no differences with respect to 5q loss of heterozygosity or codon 12 K-ras mutation, which were infrequent in both groups. Sporadic MSI-H cancers were more frequently heterogeneous (P < 0.001), poorly differentiated (P = 0.02), mucinous (P = 0.02), and proximally located (P = 0.04) than HNPCC tumors. In sporadic MSI-H cancers, contiguous adenomas were likely to be serrated, whereas traditional adenomas were dominant in HNPCC. Lymphocytic infiltration was more pronounced in HNPCC but the results did not reach statistical significance. Overall, HNPCC cancers were more like common colorectal cancer in terms of morphology and expression of beta-catenin, whereas sporadic MSI-H cancers displayed features consistent with a different morphogenesis.
No individual feature was discriminatory for all HNPCC cancers. However, a model based on four features was able to classify 94.5% of tumors as sporadic or HNPCC. The finding of multiple differences between sporadic and familial MSI-H colorectal cancer with respect to both genotype and phenotype is consistent with tumorigenesis through parallel evolutionary pathways and emphasizes the importance of studying the two groups separately.
Abstract:
1. The past 15 years have seen the emergence of a new field of neuroscience research based primarily on how the immune system and the central nervous system can interact. A notable example of this interaction occurs when peripheral inflammation, infection or tissue injury activates the hypothalamic-pituitary-adrenal (HPA) axis. 2. During such assaults, immune cells release the pro-inflammatory cytokines interleukin (IL)-1, IL-6 and tumour necrosis factor-alpha into the general circulation. 3. These cytokines are believed to act as mediators for HPA axis activation. However, physical limitations of cytokines impede their movement across the blood-brain barrier and, consequently, it has been unclear precisely how and where IL-1beta signals cross into the brain to trigger HPA axis activation. 4. Evidence from recent anatomical and functional studies suggests two neuronal networks may be involved in triggering HPA axis activity in response to circulating cytokines: catecholamine cells of the medulla oblongata and the circumventricular organs (CVO). 5. The present paper examines the role of the CVO in generating HPA axis responses to pro-inflammatory cytokines and culminates with a proposed model based on cytokine signalling primarily involving the area postrema and catecholamine cells in the ventrolateral and dorsal medulla.
Abstract:
Map algebra is a data model and simple functional notation for studying the distribution and patterns of spatial phenomena. It uses a uniform representation of space as discrete grids, which are organized into layers. This paper discusses extensions to map algebra to handle neighborhood operations with a new data type called a template. Templates provide general windowing operations on grids to enable spatial models for cellular automata, mathematical morphology, and local spatial statistics. A programming language for map algebra that incorporates templates and special processing constructs is described. The programming language is called MapScript. Example program scripts are presented that perform diverse neighborhood analyses for descriptive, model-based and process-based analysis.
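A template-style neighborhood operation of the kind described above can be sketched in plain Python; the MapScript syntax itself is not reproduced here, only the windowing idea, using a 3x3 focal mean over a grid layer:

```python
# Sketch of a map-algebra "template" (neighborhood) operation: a 3x3 focal
# mean over a grid layer, with edge cells averaging over their clipped window.
def focal_mean(grid):
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [grid[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))]
            out[r][c] = sum(vals) / len(vals)
    return out

layer = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
print(focal_mean(layer)[1][1])  # centre cell: mean of all nine values = 5.0
```

The same windowing skeleton supports the cellular-automaton and morphological operators mentioned in the abstract; only the per-window reduction changes.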
Abstract:
The present paper addresses two major concerns that were identified when developing neural network based prediction models and which can limit their wider applicability in the industry. The first problem is that it appears neural network models are not readily available to a corrosion engineer. Therefore the first part of this paper describes a neural network model of CO2 corrosion which was created using a standard commercial software package and simple modelling strategies. It was found that such a model was able to capture practically all of the trends noticed in the experimental data with acceptable accuracy. This exercise has proven that a corrosion engineer could readily develop a neural network model such as the one described below for any problem at hand, given that sufficient experimental data exist. This applies even in cases when the understanding of the underlying processes is poor. The second problem arises in cases when not all the required inputs for a model are known, or can be estimated only with a limited degree of accuracy. It seems advantageous to have models that can take as input a range rather than a single value. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons are shown which illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
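The range-valued Monte Carlo idea can be sketched as follows; the power-law corrosion-rate formula and its coefficients are placeholders, not the neural network model from the paper:

```python
import random

# Monte Carlo sketch: propagate input *ranges* through a corrosion-rate model
# to obtain a distribution of predicted rates. The model below is a made-up
# power law, standing in for the paper's neural network model.
def corrosion_rate(temp_c, co2_bar):
    return 0.1 * co2_bar**0.67 * 1.02**temp_c   # illustrative mm/year

def monte_carlo(n=10_000, seed=1):
    rng = random.Random(seed)
    rates = []
    for _ in range(n):
        t = rng.uniform(40.0, 60.0)   # temperature known only as a range
        p = rng.uniform(0.5, 2.0)     # CO2 partial pressure known only as a range
        rates.append(corrosion_rate(t, p))
    mean = sum(rates) / n
    var = sum((r - mean) ** 2 for r in rates) / n
    return mean, var**0.5

mean, sd = monte_carlo()
print(round(mean, 3), round(sd, 3))
```

The spread `sd` is what lets an engineer judge the model's sensitivity to input uncertainty; widening one input range at a time isolates each parameter's contribution.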
Abstract:
Observations of accelerating seismic activity prior to large earthquakes in natural fault systems have raised hopes for intermediate-term earthquake forecasting. If this phenomenon does exist, then what causes it to occur? Recent theoretical work suggests that the accelerating seismic release sequence is a symptom of increasing long-wavelength stress correlation in the fault region. A more traditional explanation, based on Reid's elastic rebound theory, argues that an accelerating sequence of seismic energy release could be a consequence of increasing stress in a fault system whose stress moment release is dominated by large events. Both of these theories are examined using two discrete models of seismicity: a Burridge-Knopoff block-slider model and an elastic-continuum-based model. Both models display an accelerating release of seismic energy prior to large simulated earthquakes. In both models the rate of seismic energy release correlates with both the total root-mean-squared stress and the level of long-wavelength stress correlation. Furthermore, both models exhibit a systematic increase in the number of large events at high stress and high long-wavelength stress correlation levels. These results suggest that either explanation is plausible for the accelerating moment release in the models examined. A statistical model based on the Burridge-Knopoff block-slider is constructed which indicates that stress alone is sufficient to produce an accelerating release of seismic energy with time prior to a large earthquake.
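Accelerating seismic release is commonly described by a power-law time-to-failure form for cumulative release, E(t) = A - B(tf - t)^m with 0 < m < 1, which accelerates as t approaches the failure time tf. The constants below are illustrative, not fitted to either model in the paper:

```python
# Time-to-failure sketch of accelerating moment release:
#   E(t) = A - B * (tf - t)**m,  0 < m < 1
# The release rate dE/dt grows without bound as t -> tf.
def cumulative_release(t, tf=100.0, A=50.0, B=5.0, m=0.3):
    return A - B * (tf - t)**m

# Release over a 10-unit window early in the sequence vs. just before failure:
early = cumulative_release(60) - cumulative_release(50)
late  = cumulative_release(99) - cumulative_release(89)
print(early, late)
```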
Abstract:
We focus on mixtures of factor analyzers from the perspective of a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. By working in this reduced space, it allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We shall illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
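The parameter saving that makes mixtures of factor analyzers feasible when p is large relative to n can be illustrated by counting free covariance parameters; the dimensions below are hypothetical (e.g. p genes, q latent factors):

```python
# Each component covariance in a mixture of factor analyzers is modelled as
#   Sigma = Lambda @ Lambda.T + Psi
# with Lambda a p x q loading matrix and Psi diagonal, interpolating between
# the isotropic and full covariance structures.
def full_cov_params(p):
    """Free parameters of an unrestricted symmetric p x p covariance."""
    return p * (p + 1) // 2

def factor_cov_params(p, q):
    """p*q loadings + p diagonal uniquenesses, minus q*(q-1)/2 rotational
    redundancies in Lambda."""
    return p * q + p - q * (q - 1) // 2

p, q = 1000, 5   # hypothetical: 1000 genes, 5 latent factors
print(full_cov_params(p), factor_cov_params(p, q))
```

With these hypothetical dimensions the factor-analytic structure needs roughly two orders of magnitude fewer parameters per component, which is why the latent-factor dimension q controls model complexity.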
Abstract:
In this paper, we present the results of a qualitative study of subordinate perceptions of leaders. The study represents a preliminary test of a model based on Affective Events Theory, which posits that leaders who are seen to be effective shape the affective events that determine employees' attitudes and behaviours in the workplace. Within this framework, we argue that effective leaders ameliorate employees' hassles by providing frequent, small emotional uplifts. The resulting positive affective states are then proposed to lead to more positive employee attitudes and behaviours, and more positive regard for the leader. Importantly, leaders who demonstrate these ameliorating behaviours are likely to require high levels of emotional intelligence, defined in terms of the ability to recognise, understand, and manage emotions in self and others. To investigate this model, we conducted interviews and focus groups with 10 leaders and 24 employees. Results confirmed that these processes do indeed exist in the workplace. In particular, leaders who were seen by employees to provide continuous small emotional uplifts were consistently held to be the most effective. Study participants were especially affected by negative events (or hassles). Leaders who failed to deal with hassles or, worse still, were the source of hassles, were consistently seen to be less effective. We conclude with a discussion of implications for practicing managers, and suggest that our exploratory findings provide justification for emotional intelligence training as a means to improve leader perceptions and effectiveness. [Abstract from author]
Abstract:
The analytical methods based on evaluation models of interactive systems were proposed as an alternative to user testing in the final stages of software development, owing to the cost of such testing. However, the use of isolated behavioural models of the system limits the results of the analytical methods. An example of these limitations is that they are unable to identify implementation issues that will impact usability. With the introduction of model-based testing we are able to test whether the implemented software meets the specified model. This paper presents a model-based approach for generating test cases from the static analysis of source code.
Abstract:
Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Correctness of GUI code is therefore essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their most important aspects. This paper presents a generic model for language-independent reverse engineering of graphical-user-interface-based applications, and explores the integration of model-based testing techniques into the approach, thus allowing fault detection. A prototype tool has been constructed, which is already capable of deriving and testing a user interface behavioural model of applications written in Java/Swing.
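Deriving tests from a behavioural model can be sketched as transition coverage over a finite-state machine. The login-dialog model below is hypothetical and unrelated to the prototype tool described above:

```python
# Hypothetical GUI behavioural model: (state, event) -> next state.
MODEL = {
    ("closed", "open"): "empty",
    ("empty", "type_user"): "filled",
    ("filled", "submit"): "logged_in",
    ("filled", "clear"): "empty",
}

def transition_cover(model, start="closed"):
    """Breadth-first search yielding one event sequence per model transition."""
    tests, frontier, seen = [], [(start, [])], set()
    while frontier:
        state, path = frontier.pop(0)
        for (src, event), dst in model.items():
            if src == state and (src, event) not in seen:
                seen.add((src, event))
                tests.append(path + [event])          # one test per transition
                frontier.append((dst, path + [event]))
    return tests

for t in transition_cover(MODEL):
    print(t)
```

Each printed sequence is a candidate test case: replaying it against the implementation and comparing observed states with the model is the fault-detection step.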