901 results for Formal Methods. Component-Based Development. Competition. Model Checking


Relevance: 100.00%

Abstract:

Recently, methods for computing D-optimal designs for population pharmacokinetic studies have become available. However, there are few publications that have prospectively evaluated the benefits of D-optimality in population or single-subject settings. This study compared a population optimal design with an empirical design for estimating the base pharmacokinetic model for enoxaparin in a stratified, randomized setting. The population pharmacokinetic D-optimal design for enoxaparin was estimated using the PFIM function (MATLAB version 6.0.0.88). The optimal design was based on a one-compartment model with lognormal between-subject variability and proportional residual variability, and consisted of a single design with three sampling windows (0-30 min, 1.5-5 hr and 11-12 hr post-dose) for all patients. The empirical design consisted of three sample time windows per patient from a total of nine windows that collectively represented the entire dose interval. Each patient was assigned to have one blood sample taken from three different windows. Windows for blood sampling times were also provided for the optimal design. Ninety-six patients who were currently receiving enoxaparin therapy were recruited into the study. Patients were randomly assigned to either the optimal or the empirical sampling design, stratified for body mass index. The exact times of blood samples and doses were recorded. Analysis was undertaken using NONMEM (version 5). The empirical design supported a one-compartment linear model with additive residual error, while the optimal design supported a two-compartment linear model with additive residual error, as did the model derived from the full data set. A posterior predictive check was performed in which the models arising from the empirical and optimal designs were used to predict into the full data set. This revealed that the optimal-design-derived model was superior to the empirical-design model in terms of precision and was similar to the model developed from the full dataset. This study suggests that optimal design techniques may be useful, even when the optimized design was based on a model that was misspecified in terms of the structural and statistical models, and when the implementation of the optimally designed study deviated from the nominal design.
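
A minimal sketch, not the study's PFIM or NONMEM code, of the kind of one-compartment model with proportional residual error that the designs were built around; the dose and parameter values (CL, V, ka, the 20% error) are hypothetical illustrations.

    # One-compartment model with first-order absorption and proportional error.
    # All parameter values below are hypothetical, for illustration only.
    import numpy as np

    def one_compartment_oral(t, dose, cl, v, ka):
        """Concentration at time t after a single extravascular dose."""
        ke = cl / v                                   # elimination rate constant
        return (dose * ka / (v * (ka - ke))) * (np.exp(-ke * t) - np.exp(-ka * t))

    rng = np.random.default_rng(0)
    # One sample drawn from each window, loosely mirroring the optimal design
    # windows (0-30 min, 1.5-5 hr, 11-12 hr post-dose).
    times = np.array([rng.uniform(0.0, 0.5), rng.uniform(1.5, 5.0), rng.uniform(11.0, 12.0)])
    conc = one_compartment_oral(times, dose=40.0, cl=0.7, v=5.0, ka=0.5)
    observed = conc * (1 + 0.2 * rng.standard_normal(times.size))   # proportional error
    print(np.round(times, 2), np.round(observed, 3))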

Relevance: 100.00%

Abstract:

In this paper we propose a new identification method based on the residual white noise autoregressive criterion (Pukkila et al., 1990) to select the order of VARMA structures. Results from extensive simulation experiments based on different model structures, with varying numbers of observations and component series, are used to demonstrate the performance of this new procedure. We also use economic and business data to compare the model structures selected by this order selection method with those identified in other published studies.
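
As an illustration of the underlying idea only (it does not reproduce the residual white noise autoregressive criterion of Pukkila et al., 1990, and uses a univariate series rather than a VARMA model), the sketch below fits candidate ARMA orders and then checks whether the residuals look like white noise; it assumes statsmodels is available.

    # Fit candidate orders to a simulated AR(1) series and inspect the residual
    # autocorrelation; order selection then favours the smallest adequate model.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)
    y = np.zeros(300)
    for t in range(1, 300):                  # simulate y_t = 0.6 * y_{t-1} + e_t
        y[t] = 0.6 * y[t - 1] + rng.standard_normal()

    for p, q in [(0, 0), (1, 0), (2, 1)]:    # candidate ARMA orders
        res = ARIMA(y, order=(p, 0, q)).fit()
        resid = res.resid - res.resid.mean()
        r1 = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])   # lag-1 residual autocorrelation
        print(f"ARMA({p},{q}): AIC={res.aic:.1f}, lag-1 residual autocorr={r1:.3f}")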

Relevance: 100.00%

Abstract:

One common characteristic of breast cancers arising in carriers of the predisposition gene BRCA1 is a loss of expression of the CDK inhibitor p27(Kip1) (p27), suggesting that p27 interacts epistatically with BRCA1. To investigate this relationship, we examined expression of p27 in mice expressing a dominant negative allele of Brca1 (MMTV-trBr) in the mammary gland. While these mice rarely develop tumors, they showed a 50% increase in p27 protein and a delay in mammary gland development associated with reduced proliferation. In contrast, on a p27 heterozygote background, MMTV-trBrca1 mice showed an increase in S-phase cells and normal mammary development. p27 was the only protein in the cyclin/cyclin-dependent kinase network to show altered expression, suggesting that it may be a central mediator of cell cycle arrest in response to loss of function of BRCA1. Furthermore, in human mammary epithelial MCF7 cells expressing BRCA1-specific RNAi and in the BRCA1-deficient human tumor cell line HCC1937, p27 is elevated at the mRNA level compared to cells expressing wild-type BRCA1. We hypothesize that disruption of BRCA1 induces an increase in p27 that inhibits proliferation. Accordingly, reduction in p27 expression leads to enhancement of cellular proliferation in the absence of BRCA1.

Relevance: 100.00%

Abstract:

Immunotherapy of tumours using T cells expanded in vitro has met with mixed clinical success, suggesting that a greater understanding of tumour/T-cell interaction is required. We used an HPV16E7 oncoprotein-based mouse tumour model to study this further. In this study, we demonstrate that an HPV16E7 tumour passes through at least three stages of immune susceptibility over time. At the earliest time point, infusion of intravenous immune cells fails to control tumour growth, although the same cells given subcutaneously at the tumour site are effective. In a second stage, the tumour becomes resistant to subcutaneous infusion of cells but is now susceptible to both adjuvant-activated and HPV16E7-specific immune cells transferred intravenously. In the last phase, the tumour is susceptible to intravenous transfer of HPV16E7-specific cells, but not adjuvant-activated immune cells. The requirement for IFN-gamma and perforin also changes with each stage of tumour development. Our data suggest that effective adoptive T-cell therapy of tumours will need to be matched to the stage of tumour development.

Relevance: 100.00%

Abstract:

The particle-based lattice solid model developed to study the physics of rocks and the nonlinear dynamics of earthquakes is refined by incorporating intrinsic friction between particles. The model provides a means for studying the causes of seismic wave attenuation, as well as frictional heat generation, fault zone evolution, and localisation phenomena. A modified velocity-Verlet scheme that allows friction to be precisely modelled is developed. This is a difficult computational problem given that a discontinuity must be accurately simulated by the numerical approach (i.e., the transition from static to dynamic frictional behaviour). This is achieved using a half-time-step integration scheme. At each half time step, a nonlinear system is solved to compute the static frictional forces and states of touching particle pairs. Improved efficiency is achieved by adaptively adjusting the time step increment, depending on the particle velocities in the system. The total energy is calculated and verified to remain constant to a high precision during simulations. Numerical experiments show that the model can be applied to the study of earthquake dynamics, the stick-slip instability, heat generation, and fault zone evolution. Such experiments may lead to a conclusive resolution of the heat flow paradox and improved understanding of earthquake precursory phenomena and dynamics. (C) 1999 Academic Press.
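
A minimal sketch of the numerical issue described above, not the lattice solid model itself: a velocity-Verlet integrator with half-step velocity updates for a single spring-driven block subject to Coulomb friction, where the static-to-dynamic transition is the discontinuity that must be handled. All parameter values are illustrative.

    # Stick-slip toy model integrated with velocity-Verlet (half-step velocities).
    import numpy as np

    m, dt = 1.0, 1e-3
    k, v_drive = 10.0, 0.1                   # spring stiffness, driver velocity
    mu_s, mu_d, normal = 0.6, 0.4, 9.81      # static/dynamic friction, normal force

    def friction(f_applied, v):
        """Coulomb friction: stick while below the static limit, otherwise slide."""
        if abs(v) < 1e-8 and abs(f_applied) <= mu_s * normal:
            return -f_applied                          # static: cancels the applied force
        direction = np.sign(v) if abs(v) >= 1e-8 else np.sign(f_applied)
        return -mu_d * normal * direction              # dynamic: opposes the motion

    x, v, a = 0.0, 0.0, 0.0
    for step in range(20000):
        x += v * dt + 0.5 * a * dt ** 2                # position update
        f_spring = k * (v_drive * step * dt - x)       # pull from the moving driver
        a_new = (f_spring + friction(f_spring, v)) / m
        v += 0.5 * (a + a_new) * dt                    # half-step velocity update
        a = a_new
    print(f"final position {x:.3f}, final velocity {v:.4f}")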

Relevance: 100.00%

Abstract:

In component-based software engineering, programs are constructed from pre-defined software library modules. However, if the library's subroutines do not exactly match the programmer's requirements, the subroutines' code must be adapted accordingly. For this process to be acceptable in safety- or mission-critical applications, where all code must be proven correct, it must be possible to verify the correctness of the adaptations themselves. In this paper we show how refinement theory can be used to model typical adaptation steps and to define the conditions that must be proven to verify that a library subroutine has been adapted correctly.
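
A hypothetical illustration of the idea, not the paper's refinement formalism: a library routine with a strong precondition is adapted to accept a weaker one, and the conditions a correctness proof would have to discharge are recorded here as runtime checks.

    # Adapting a library routine and recording the verification conditions.
    from bisect import bisect_left

    def library_member(sorted_xs, key):
        """Library subroutine: requires sorted_xs to be sorted in ascending order."""
        i = bisect_left(sorted_xs, key)
        return i < len(sorted_xs) and sorted_xs[i] == key

    def adapted_member(xs, key):
        """Adaptation: accepts any list by sorting a copy first. The proof
        obligations are that the copy meets the library precondition and that
        the original postcondition (membership) is preserved."""
        ys = sorted(xs)
        assert all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1))  # library precondition
        result = library_member(ys, key)
        assert result == (key in xs)                                 # postcondition preserved
        return result

    print(adapted_member([3, 1, 2], 2), adapted_member([3, 1, 2], 5))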

Relevance: 100.00%

Abstract:

In this paper, a new differential evolution (DE) based power system optimal available transfer capability (ATC) assessment is presented. Power system total transfer capability (TTC) is traditionally solved by the repeated power flow (RPF) method and the continuation power flow (CPF) method. These methods are based on the assumption that the outputs of the source-area generators are increased in identical proportion to balance the load increment in the sink area. A new approach based on the DE algorithm to generate an optimal dispatch of both source-area generators and sink-area loads is proposed in this paper. This new method can compute ATC between two areas with significantly improved accuracy compared with the traditional RPF- and CPF-based methods. A case study using a 30-bus system is given to verify the efficiency and effectiveness of this new DE-based ATC optimization approach.
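
A minimal DE/rand/1/bin sketch on a toy objective, showing the kind of search the paper applies to dispatching source-area generation and sink-area load; the bounds and the objective below are placeholders, not the ATC formulation.

    # Classic differential evolution: mutation, binomial crossover, greedy selection.
    import numpy as np

    def de_minimize(f, bounds, pop_size=30, gens=200, F=0.5, CR=0.9, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = bounds[:, 0], bounds[:, 1]
        pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
        fit = np.array([f(x) for x in pop])
        for _ in range(gens):
            for i in range(pop_size):
                a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
                mutant = np.clip(a + F * (b - c), lo, hi)           # mutation
                cross = rng.random(len(bounds)) < CR                # binomial crossover
                cross[rng.integers(len(bounds))] = True
                trial = np.where(cross, mutant, pop[i])
                if (ft := f(trial)) < fit[i]:                       # greedy selection
                    pop[i], fit[i] = trial, ft
        return pop[fit.argmin()], fit.min()

    sphere = lambda x: float(np.sum(x ** 2))                        # placeholder objective
    best, val = de_minimize(sphere, np.array([[-5.0, 5.0]] * 4))
    print(best.round(4), round(val, 6))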

Relevance: 100.00%

Abstract:

As an alternative to traditional evolutionary algorithms (EAs), population-based incremental learning (PBIL) maintains a probabilistic model of the best individual(s). Originally, PBIL was applied in binary search spaces. Recently, some work has been done to extend it to continuous spaces. In this paper, we review two such extensions of PBIL. An improved version of PBIL based on a Gaussian model is proposed that combines two main features: a new updating rule that takes into account all the individuals and their fitness values, and a self-adaptive learning rate parameter. Furthermore, a new continuous PBIL employing a histogram probabilistic model is proposed. Some experimental results are presented that highlight the features of the new algorithms.
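
A sketch in the spirit of the Gaussian-model variant: sample from the probabilistic model, then move the mean and spread toward the better samples. The elite-based update rule and the fixed learning rate below are simplified placeholders, not the exact scheme proposed in the paper.

    # Continuous PBIL with a Gaussian model (simplified update rule).
    import numpy as np

    def pbil_gaussian(f, dim, iters=200, pop=50, lr=0.2, seed=0):
        rng = np.random.default_rng(seed)
        mean, sigma = np.zeros(dim), np.full(dim, 2.0)
        for _ in range(iters):
            samples = rng.normal(mean, sigma, size=(pop, dim))       # sample the model
            fitness = np.array([f(x) for x in samples])
            elite = samples[np.argsort(fitness)[: pop // 5]]         # best 20%
            mean = (1 - lr) * mean + lr * elite.mean(axis=0)         # shift the model
            sigma = (1 - lr) * sigma + lr * elite.std(axis=0)        # shrink the spread
        return mean, f(mean)

    rosenbrock = lambda x: float(np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2))
    best, val = pbil_gaussian(rosenbrock, dim=3)
    print(best.round(3), round(val, 4))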

Relevance: 100.00%

Abstract:

Individual acceptance and use of Information Technology (IT) have been studied through different conceptual models which, in general, derive from psychological theories such as the TRA (Theory of Reasoned Action) and the TPB (Theory of Planned Behavior), itself derived from the former. An important analytical model derived from them, the result of a detailed analysis of eight earlier models, the UTAUT (Unified Theory of Acceptance and Use of Technology) of Venkatesh et al. (2003), has been widely analysed and validated across various technology scenarios and environments. This work aims to gain a broader understanding of the antecedents of intention to use and usage behaviour based on the UTAUT model, to identify the factors that best explain intention and usage behaviour, and to analyse their moderators. In developing the model, Venkatesh et al. carried out comparisons across three implementation stages and two scenarios: mandatory adoption, which takes place in a corporate environment where the system is required for executing processes and making decisions, and voluntary adoption, in which adoption is decided by the individual. In the latter case, the authors concluded that the social influence factor has low magnitude and significance and is therefore not an important factor in technology adoption. This work also analyses whether the same phenomenon occurs for adoption that is voluntary but likely to be strongly influenced by social ties, as happens among users of social networks such as Orkut, Facebook, Twitter and LinkedIn, especially for technologies that enable gains tied to exercising those ties, as in the use of collective-buying sites such as Peixe Urbano, Groupon and Clickon. Based on the UTAUT model, a survey was administered, and the results of 292 validated respondents, reached through e-mail and social networks, were subsequently analysed. The analysis technique consisted of structural equation modelling based on the PLS (Partial Least Squares) algorithm, with a bootstrap of 1,000 resamples. The results showed high magnitude and predictive significance of the factors Performance Expectancy (0.288 at 0.1%) and Social Influence (0.176 at 0.1%) on intention to use the technology. The former is consistent with previous studies. The magnitude and significance of the latter factor, however, were far higher than in the original study of Venkatesh et al. (2003), where it ranged from 0.02 to 0.04 and was not significant, depending on whether the data were pooled or not (p. 465). The main conclusion of this study is that, when considering the collective-buying phenomenon, a voluntary adoption setting, the social factor is highly influential on the intention to use the technology. This contrasts strongly with the original UTAUT study (in which this factor was not significant) and opens several possibilities for future research and possible managerial implications.
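
A simplified stand-in for the procedure described above: bootstrapping (1,000 resamples) the coefficient of a path such as Social Influence on Intention to Use. The data are synthetic and ordinary least squares replaces the full PLS path model, so this only illustrates how the significance of a path coefficient is assessed.

    # Bootstrapped regression coefficient as a toy analogue of a PLS path estimate.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 292
    social_influence = rng.standard_normal(n)
    performance_expectancy = rng.standard_normal(n)
    intention = 0.29 * performance_expectancy + 0.18 * social_influence + rng.standard_normal(n)

    X = np.column_stack([np.ones(n), performance_expectancy, social_influence])
    coef = np.linalg.lstsq(X, intention, rcond=None)[0]

    boot = np.empty((1000, 3))
    for b in range(1000):                              # bootstrap resampling
        idx = rng.integers(0, n, n)
        boot[b] = np.linalg.lstsq(X[idx], intention[idx], rcond=None)[0]

    lo, hi = np.percentile(boot[:, 2], [0.05, 99.95])  # 99.9% CI for the social path
    print(f"social influence path = {coef[2]:.3f}, 99.9% CI [{lo:.3f}, {hi:.3f}]")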

Relevance: 100.00%

Abstract:

Risk and knowledge are two concepts and components of business management which have so far been studied almost independently. This is especially true where risk management (RM) is conceived mainly in financial terms, as, for example, in the financial institutions sector. Financial institutions are affected by internal and external changes with the consequent accommodation to new business models, new regulations and new global competition that includes new big players. These changes induce financial institutions to develop different methodologies for managing risk, such as the enterprise risk management (ERM) approach, in order to adopt a holistic view of risk management and, consequently, to deal with different types of risk, levels of risk appetite, and policies in risk management. However, the methodologies for analysing risk do not explicitly include knowledge management (KM). This research examines the potential relationships between KM and two RM concepts: perceived quality of risk control and perceived value of ERM. To fulfill the objective of identifying how KM concepts can have a positive influence on some RM concepts, a literature review of KM and its processes and of RM and its processes was performed. From this literature review, eight hypotheses were analysed using a classification into people, process and technology variables. The data for this research were gathered from a survey of risk management employees in financial institutions, and 121 answers were analysed. The analysis of the data was based on multivariate techniques, more specifically stepwise regression analysis. The results showed that the perceived quality of risk control is significantly associated with the variables: perceived quality of risk knowledge sharing, perceived quality of communication among people, web channel functionality, and risk management information system functionality. However, the relationships of the KM variables to the perceived value of ERM were not identified because of the low performance of the models describing these relationships. The analysis reveals important insights into the potential KM support to RM, such as: the better the adoption of KM people and technology actions, the better the perceived quality of risk control. Equally, the results suggest that the quality of risk control and the benefits of ERM follow different patterns, given that there is no correlation between the two concepts and that the KM variables influence each concept differently. The ERM scenario is different from that of risk control because ERM, as an answer to RM failures and an adaptation to new regulation in financial institutions, has led organizations to adopt new processes, technologies, and governance models. Thus, the search for factors influencing the perceived value of ERM implementation needs additional analysis, because what is improved in individual RM processes is not having the same effect on the perceived value of ERM. Based on these model results and the literature review, the basis of the ERKMAS (Enterprise Risk Knowledge Management System) is presented.
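
A minimal forward stepwise regression sketch on synthetic data, illustrating the kind of variable selection reported above; the variable names echo the survey constructs, but the data and the AIC-based stopping rule here are placeholders for the study's own procedure.

    # Forward stepwise selection: add the predictor that most improves AIC, stop
    # when no candidate improves it.
    import numpy as np

    def aic(y, X):
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        rss = float(np.sum((y - X @ beta) ** 2))
        n, k = X.shape
        return n * np.log(rss / n) + 2 * k

    rng = np.random.default_rng(0)
    n, names = 121, ["knowledge_sharing", "communication", "web_channel", "rmis", "noise"]
    Z = rng.standard_normal((n, len(names)))
    y = 0.5 * Z[:, 0] + 0.4 * Z[:, 1] + 0.3 * Z[:, 3] + rng.standard_normal(n)

    selected, remaining = [], list(range(len(names)))
    current = aic(y, np.ones((n, 1)))
    while remaining:
        scores = {j: aic(y, np.column_stack([np.ones(n), Z[:, selected + [j]]])) for j in remaining}
        best = min(scores, key=scores.get)
        if scores[best] >= current:          # stop when no candidate improves AIC
            break
        selected.append(best); remaining.remove(best); current = scores[best]
    print("selected:", [names[j] for j in selected])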

Relevance: 100.00%

Abstract:

Purpose – To investigate the impact of performance measurement in the strategic planning process. Design/methodology/approach – A large-scale survey was conducted online with Warwick Business School alumni. The questionnaire was based on the Strategic Development Process model by Dyson. The questionnaire was designed to map the current practice of strategic planning and to determine the factors most influencing the effectiveness of the process. All questions were closed-ended and a seven-point Likert scale was used. The independent variables were grouped into four meaningful factors by factor analysis (Varimax, coefficient of rotation 0.4). The factors produced were used to build stepwise regression models for the five assessments of the strategic planning process. Regression models were developed for the totality of the responses, comparing SMEs and large organizations and comparing organizations operating in slowly and rapidly changing environments. Findings – The results indicate that performance measurement stands as one of the four main factors characterising the current practice of strategic planning. This research has determined that complexity arising from organizational size and the rate of change in the sector creates variation in the impact of performance measurement in strategic planning. Large organizations and organizations operating in rapidly changing environments make greater use of performance measurement. Research limitations/implications – This research is based on subjective data; therefore the conclusions do not concern the impact of the strategic planning process's elements on organizational performance achievements, but the success/effectiveness of the strategic planning process itself. Practical implications – This research raises a series of questions about the use and potential impact of performance measurement, especially in the categories of organizations that are not significantly influenced by its utilisation. It contributes to the field of performance measurement impact. Originality/value – This research fills a gap in the literature concerning the lack of large-scale surveys on strategic development processes and performance measurement. It also contributes to the literature in this field by providing empirical evidence on the impact of performance measurement upon the strategic planning process.
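
A sketch of the grouping step described above, assuming scikit-learn is available: extract factors from synthetic Likert-type items, apply a varimax rotation, and keep loadings above the 0.4 threshold. The data, the number of items and the number of factors are invented for illustration.

    # Factor analysis with varimax rotation and a 0.4 loading cut-off.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    latent = rng.standard_normal((200, 4))                 # four underlying factors
    loading = rng.uniform(0.5, 0.9, size=(4, 16)) * (rng.random((4, 16)) < 0.3)
    items = latent @ loading + 0.5 * rng.standard_normal((200, 16))

    fa = FactorAnalysis(n_components=4, rotation="varimax").fit(items)
    loadings = fa.components_.T                            # items x factors
    for i, row in enumerate(loadings):
        strong = [f"F{j + 1}" for j, v in enumerate(row) if abs(v) > 0.4]
        print(f"item {i + 1:2d} loads on: {strong or '-'}")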

Relevance: 100.00%

Abstract:

Fare, Grosskopf, Norris and Zhang developed a non-parametric productivity index, the Malmquist index, using data envelopment analysis (DEA). The Malmquist index is a measure of productivity progress (or regress) and can be decomposed into components such as 'efficiency catch-up' and 'technology change'. However, the Malmquist index and its components are based on two periods of time, which can capture only part of the impact of investment in long-lived assets. The effects of lags in the investment process on the capital stock have been ignored in the current Malmquist index model. This paper extends the recent dynamic DEA models introduced by Emrouznejad and Thanassoulis and by Emrouznejad to a dynamic Malmquist index. This paper shows that the dynamic productivity results for Organisation for Economic Co-operation and Development countries should reflect reality better than those based on the conventional model.
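
For reference, the standard Fare-Grosskopf-Norris-Zhang form of the two-period index the paper builds on, written with output distance functions D_o: the geometric mean of the index measured against the period-t and period-(t+1) frontiers, decomposed into efficiency catch-up and technology change.

    M_o(x^{t+1}, y^{t+1}, x^t, y^t)
      = \left[ \frac{D_o^t(x^{t+1}, y^{t+1})}{D_o^t(x^t, y^t)}
        \cdot \frac{D_o^{t+1}(x^{t+1}, y^{t+1})}{D_o^{t+1}(x^t, y^t)} \right]^{1/2}
      = \underbrace{\frac{D_o^{t+1}(x^{t+1}, y^{t+1})}{D_o^t(x^t, y^t)}}_{\text{efficiency catch-up}}
        \cdot \underbrace{\left[ \frac{D_o^t(x^{t+1}, y^{t+1})}{D_o^{t+1}(x^{t+1}, y^{t+1})}
        \cdot \frac{D_o^t(x^t, y^t)}{D_o^{t+1}(x^t, y^t)} \right]^{1/2}}_{\text{technology change}}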

Relevance: 100.00%

Abstract:

In this position paper we present the developing Fluid framework, which we believe offers considerable advantages in maintaining software stability in dynamic or evolving application settings. The Fluid framework facilitates the development of component software via the selection, composition and configuration of components. Fluid's composition language incorporates a high-level type system supporting object-oriented principles such as type description, type inheritance, and type instantiation. Object-oriented relationships are represented via the dynamic composition of component instances. This representation allows the software structure, as specified by type and instance descriptions, to change dynamically at runtime as existing types are modified and new types and instances are introduced. We therefore move from static software structure descriptions to more dynamic representations, while maintaining the expressiveness of object-oriented semantics. We show how the Fluid framework relates to existing, largely component-based software frameworks and conclude with suggestions for future enhancements. © 2007 IEEE.
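
A generic illustration only, not the Fluid composition language or its API: component types described as data, with inheritance and instantiation resolved at runtime, so that the composition can change while the program is running.

    # Component types as runtime data: description, inheritance, instantiation.
    from dataclasses import dataclass, field

    @dataclass
    class ComponentType:
        name: str
        parent: "ComponentType | None" = None
        config: dict = field(default_factory=dict)

        def resolved_config(self):
            base = self.parent.resolved_config() if self.parent else {}
            return {**base, **self.config}             # child settings override parent

    registry: dict[str, ComponentType] = {}

    def define(name, parent=None, **config):
        registry[name] = ComponentType(name, registry.get(parent), config)
        return registry[name]

    def instantiate(name):
        return dict(registry[name].resolved_config())  # a fresh instance configuration

    define("Renderer", backend="software")
    define("GpuRenderer", parent="Renderer", backend="opengl", shaders=True)
    print(instantiate("GpuRenderer"))                  # inherits and overrides Renderer

    # Type descriptions can be modified at runtime; later instances reflect the change.
    registry["Renderer"].config["vsync"] = True
    print(instantiate("GpuRenderer"))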

Relevance: 100.00%

Abstract:

The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next-generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software through the selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component-based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology for component-based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated, in order to develop a visualization application for geospatial data. In particular, we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.