8 results for Little Higgs model

in Aston University Research Archive


Relevance:

30.00%

Abstract:

Knowledge management (KM) is an emerging discipline (Ives, Torrey & Gordon, 1997) characterised by four processes: generation, codification, transfer, and application (Alavi & Leidner, 2001). Completing the loop, knowledge transfer is regarded as a precursor to knowledge creation (Nonaka & Takeuchi, 1995) and thus forms an essential part of the knowledge management process. Understanding how knowledge is transferred is important for explaining the evolution of, and change in, institutions, organisations, technology, and the economy. However, knowledge transfer is often found to be laborious, time consuming, complicated, and difficult to understand (Huber, 2001; Szulanski, 2000). It has received negligible systematic attention (Huber, 2001; Szulanski, 2000), and thus we know little about it (Huber, 2001). Some literature, such as Davenport and Prusak (1998) and Shariq (1999), has addressed knowledge transfer within an organisation, but studies of inter-organisational knowledge transfer remain much neglected. An emergent view is that organisations would benefit from research that helps them understand, and thus improve, their inter-organisational knowledge transfer processes. This article therefore aims to provide an overview of inter-organisational knowledge transfer and the related literature, and to present a proposed inter-organisational knowledge transfer process model based on theoretical and empirical studies.

Relevance:

30.00%

Abstract:

Although techniques such as biopanning rely heavily upon the screening of randomized gene libraries, there is surprisingly little information available on the construction of those libraries. In general, construction is based on the cloning of 'randomized' synthetic oligonucleotides, in which given position(s) contain an equal mixture of all four bases. Yet many supposedly 'randomized' libraries contain significant elements of bias and/or omission. Here, we report the development and validation of a new, PCR-based assay that enables rapid examination of library composition both prior to and after cloning. By using our assay to analyse model libraries, we demonstrate that the cloning of a given distribution of sequences does not necessarily result in a similarly composed library of clones. Thus, while bias in randomized synthetic oligonucleotide mixtures can be virtually eliminated by using unequal ratios of the four phosphoramidites, the use of such mixtures does not ensure retrieval of a truly randomized library. We propose that, in the absence of a technique to control cloning frequencies, the ability to analyse the composition of libraries after cloning will significantly enhance the quality of information derived from those libraries.
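As a rough illustration of the bias problem described above, the sketch below (with hypothetical incorporation-efficiency values that are not taken from the study) shows how an equal input mixture of the four phosphoramidites can yield a skewed base composition at a 'randomized' position, and how unequal input ratios can compensate.

```python
# Illustrative sketch only: the efficiency values below are assumptions, not data
# from the study. They show how unequal input ratios of the four phosphoramidites
# can compensate for unequal incorporation at a 'randomized' position.

# Relative incorporation efficiency of each phosphoramidite (hypothetical values).
efficiency = {"A": 1.00, "C": 0.90, "G": 1.10, "T": 0.85}

def base_distribution(input_ratio):
    """Expected base composition at one randomized position, given the molar
    input ratio of each phosphoramidite and its incorporation efficiency."""
    weights = {b: input_ratio[b] * efficiency[b] for b in "ACGT"}
    total = sum(weights.values())
    return {b: round(w / total, 3) for b, w in weights.items()}

# Equal input mixture: composition is biased by the efficiency differences.
equal = {b: 0.25 for b in "ACGT"}
print("equal input:       ", base_distribution(equal))

# Compensating (unequal) mixture: weight each base inversely to its efficiency.
inv = {b: 1.0 / efficiency[b] for b in "ACGT"}
norm = sum(inv.values())
compensating = {b: v / norm for b, v in inv.items()}
print("compensating input:", base_distribution(compensating))  # ~0.25 each
```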

Relevance:

30.00%

Abstract:

Adapting to blurred or sharpened images alters perceived blur of a focused image (M. A. Webster, M. A. Georgeson, & S. M. Webster, 2002). We asked whether blur adaptation results in (a) renormalization of perceived focus or (b) a repulsion aftereffect. Images were checkerboards or 2-D Gaussian noise, whose amplitude spectra had (log-log) slopes from -2 (strongly blurred) to 0 (strongly sharpened). Observers adjusted the spectral slope of a comparison image to match different test slopes after adaptation to blurred or sharpened images. Results did not show repulsion effects but were consistent with some renormalization. Test blur levels at and near a blurred or sharpened adaptation level were matched by more focused slopes (closer to 1/f) but with little or no change in appearance after adaptation to focused (1/f) images. A model of contrast adaptation and blur coding by multiple-scale spatial filters predicts these blur aftereffects and those of Webster et al. (2002). A key proposal is that observers are pre-adapted to natural spectra, and blurred or sharpened spectra induce changes in the state of adaptation. The model illustrates how norms might be encoded and recalibrated in the visual system even when they are represented only implicitly by the distribution of responses across multiple channels.
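For concreteness, the following sketch (not the authors' code) generates 2-D Gaussian noise with a chosen log-log amplitude-spectrum slope, the stimulus manipulation described above; slopes of -2, -1 and 0 correspond to strongly blurred, focused (1/f) and strongly sharpened images respectively.

```python
# Minimal sketch: synthesise 2-D noise whose amplitude spectrum falls off as
# f**slope, approximating the blurred/focused/sharpened stimuli described above.
import numpy as np

def noise_with_spectral_slope(size=256, slope=-1.0, rng=None):
    """Return a size x size noise image with amplitude spectrum ~ f**slope."""
    rng = np.random.default_rng() if rng is None else rng
    fx = np.fft.fftfreq(size)
    fy = np.fft.fftfreq(size)
    f = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    f[0, 0] = 1.0                       # avoid division by zero at DC
    amplitude = f ** slope              # desired radial amplitude spectrum
    amplitude[0, 0] = 0.0               # zero-mean image
    phase = rng.uniform(0, 2 * np.pi, (size, size))
    img = np.real(np.fft.ifft2(amplitude * np.exp(1j * phase)))
    return (img - img.mean()) / img.std()   # normalise contrast

blurred   = noise_with_spectral_slope(slope=-2.0)  # steep spectrum: blurred
focused   = noise_with_spectral_slope(slope=-1.0)  # 1/f: focused appearance
sharpened = noise_with_spectral_slope(slope=0.0)   # flat spectrum: sharpened
```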

Relevance:

30.00%

Abstract:

In this paper we investigate whether consideration of store-level heterogeneity in marketing mix effects improves the accuracy of the marketing mix elasticities, fit, and forecasting accuracy of the widely-applied SCAN*PRO model of store sales. Models with continuous and discrete representations of heterogeneity, estimated using hierarchical Bayes (HB) and finite mixture (FM) techniques, respectively, are empirically compared to the original model, which does not account for store-level heterogeneity in marketing mix effects, and is estimated using ordinary least squares (OLS). The empirical comparisons are conducted in two contexts: Dutch store-level scanner data for the shampoo product category, and an extensive simulation experiment. The simulation investigates how between- and within-segment variance in marketing mix effects, error variance, the number of weeks of data, and the number of stores impact the accuracy of marketing mix elasticities, model fit, and forecasting accuracy. Contrary to expectations, accommodating store-level heterogeneity does not improve the accuracy of marketing mix elasticities relative to the homogeneous SCAN*PRO model, suggesting that little may be lost by employing the original homogeneous SCAN*PRO model estimated using ordinary least squares. Improvements in fit and forecasting accuracy are also fairly modest. We pursue an explanation for this result since research in other contexts has shown clear advantages from assuming some type of heterogeneity in market response models. In an Afterthought section, we comment on the controversial nature of our result, distinguishing factors inherent to household-level data and associated models vs. general store-level data and associated models vs. the unique SCAN*PRO model specification.
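As a much-simplified illustration of the modelling choice at stake (this is not the SCAN*PRO specification itself), the sketch below fits a log-log store sales equation with either a single pooled price elasticity (homogeneous, OLS) or one elasticity per store, on simulated data; the HB and FM estimators used in the paper sit between these two extremes.

```python
# Simplified illustration only, NOT the actual SCAN*PRO specification: a log-log
# store-sales regression with a price elasticity that is either pooled across
# stores (homogeneous, OLS) or estimated separately per store. All data are
# simulated for the sketch.
import numpy as np

rng = np.random.default_rng(0)
n_stores, n_weeks = 20, 52
true_elasticity = rng.normal(-2.0, 0.3, n_stores)       # store-level heterogeneity
log_price = rng.normal(0.0, 0.2, (n_stores, n_weeks))
log_sales = (3.0 + true_elasticity[:, None] * log_price
             + rng.normal(0.0, 0.5, (n_stores, n_weeks)))  # error term

# Homogeneous model: one elasticity for all stores (pooled OLS).
X = np.column_stack([np.ones(n_stores * n_weeks), log_price.ravel()])
beta_pooled, *_ = np.linalg.lstsq(X, log_sales.ravel(), rcond=None)

# Heterogeneous model: a separate OLS fit per store.
beta_store = []
for s in range(n_stores):
    Xs = np.column_stack([np.ones(n_weeks), log_price[s]])
    bs, *_ = np.linalg.lstsq(Xs, log_sales[s], rcond=None)
    beta_store.append(bs[1])

print("pooled elasticity estimate:", round(float(beta_pooled[1]), 2))
print("mean per-store estimate:   ", round(float(np.mean(beta_store)), 2))
```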

Relevance:

30.00%

Abstract:

WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT
• Little is known about the pharmacokinetics of potassium canrenoate/canrenone in paediatric patients.

WHAT THIS STUDY ADDS
• A population pharmacokinetic model has been developed to evaluate the pharmacokinetics of canrenone in paediatric patients who received potassium canrenoate as part of their therapy in the intensive care unit.

AIMS To characterize the population pharmacokinetics of canrenone following administration of potassium canrenoate to paediatric patients.

METHODS Data were collected prospectively from 23 paediatric patients (2 days to 10 years of age; median weight 4 kg, range 2.16–28.0 kg) who received intravenous potassium canrenoate (K-canrenoate) as part of their intensive care therapy for removal of retained fluids, e.g. in pulmonary oedema due to chronic lung disease and for the management of congestive heart failure. Plasma samples were analyzed by HPLC for determination of canrenone (the major metabolite and pharmacologically active moiety) and the data subjected to pharmacokinetic analysis using NONMEM.

RESULTS A one compartment model best described the data. The only significant covariate was weight (WT). The final population models for canrenone clearance (CL/F) and volume of distribution (V/F) were CL/F (l h⁻¹) = 11.4 × (WT/70.0)^0.75 and V/F (l) = 374.2 × (WT/70), where WT is in kg. The values of CL/F and V/F in a 4 kg child would be 1.33 l h⁻¹ and 21.4 l, respectively, resulting in an elimination half-life of 11.2 h.

CONCLUSIONS The range of estimated CL/F in the study population was 0.67–7.38 l h⁻¹. The data suggest that adjustment of K-canrenoate dosage according to body weight is appropriate in paediatric patients.
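The reported covariate model can be evaluated directly; the sketch below assumes the standard one-compartment relationship t1/2 = ln(2) × V/CL and reproduces the quoted figures for a 4 kg child (small differences reflect rounding).

```python
# Quick numerical check of the reported covariate model: allometric scaling of
# clearance by weight (exponent 0.75) and linear scaling of volume.
import math

def canrenone_pk(weight_kg):
    """Clearance (l/h), volume (l) and elimination half-life (h) for a child of
    the given weight, using the final population model reported above."""
    cl_f = 11.4 * (weight_kg / 70.0) ** 0.75   # CL/F, allometric exponent 0.75
    v_f = 374.2 * (weight_kg / 70.0)           # V/F, linear in weight
    t_half = math.log(2) * v_f / cl_f          # one-compartment half-life
    return cl_f, v_f, t_half

cl, v, t = canrenone_pk(4.0)   # median study weight of 4 kg
print(f"CL/F = {cl:.2f} l/h, V/F = {v:.1f} l, t1/2 = {t:.1f} h")
# ~1.33 l/h, ~21.4 l and ~11.2 h, close to the values quoted in the abstract
```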

Relevance:

30.00%

Abstract:

This study re-examines the one-dimensional equilibrium model of Gibilaro and Rowe (1974) for a segregating gas fluidized bed. The model was based on volumetric jetsam concentration and divided the bed contents into bulk and wake phases, taking account of bulk and wake flux, segregation, exchange between the bulk and wake phases, and axial mixing. Because of the complex nature of the model and its unstable solution, the lack of computing power at the time meant the authors could do little more than obtain analytical solutions to specific cases of the model. This paper provides a complete numerical solution and allows the effects of the respective parameters to be compared for the first time. A comparison with experimental results is also presented and shows reasonable agreement.
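The model equations themselves are not reproduced here. Purely as an illustration of the kind of numerical treatment involved, the sketch below integrates a generic one-dimensional two-phase (bulk/wake) transport problem with convection, axial mixing and inter-phase exchange using an explicit finite-difference scheme; all coefficients and initial conditions are placeholder values, not those of Gibilaro and Rowe.

```python
# Generic illustration only: NOT the Gibilaro-Rowe equations. A 1-D two-phase
# transport problem (bulk + wake jetsam concentration) with convection, axial
# mixing and inter-phase exchange, integrated by an explicit finite-difference
# scheme. All coefficients are placeholder values chosen for numerical stability.
import numpy as np

nz, dz, dt, n_steps = 100, 0.01, 1e-4, 20000
u_b, u_w = -0.05, 0.10    # bulk moves down, wake moves up (assumed velocities)
D = 1e-3                  # axial mixing coefficient (assumed)
k = 0.5                   # bulk-wake exchange rate (assumed)

z_index = np.arange(nz)                          # index 0 = bottom of bed
cb = np.where(z_index >= nz // 2, 1.0, 0.0)      # jetsam initially in top half (bulk)
cw = cb.copy()                                   # same initial profile in wake

for _ in range(n_steps):
    dcb = np.zeros(nz)
    dcw = np.zeros(nz)
    # upwind convection (u_b < 0: forward difference; u_w > 0: backward),
    # central-difference axial mixing, first-order bulk-wake exchange
    dcb[1:-1] = (-u_b * (cb[2:] - cb[1:-1]) / dz
                 + D * (cb[2:] - 2 * cb[1:-1] + cb[:-2]) / dz**2
                 + k * (cw[1:-1] - cb[1:-1]))
    dcw[1:-1] = (-u_w * (cw[1:-1] - cw[:-2]) / dz
                 + k * (cb[1:-1] - cw[1:-1]))
    cb[1:-1] += dt * dcb[1:-1]
    cw[1:-1] += dt * dcw[1:-1]
    cb[0], cb[-1] = cb[1], cb[-2]     # zero-gradient boundaries
    cw[0], cw[-1] = cw[1], cw[-2]

print("bulk jetsam concentration, bottom vs top:",
      round(float(cb[0]), 3), round(float(cb[-1]), 3))
```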

Relevance:

30.00%

Abstract:

Periconceptional environment may influence embryo development, ultimately affecting adult health. Here, we review the rodent model of maternal low-protein diet specifically during the preimplantation period (Emb-LPD) with normal nutrition during subsequent gestation and postnatally. This model, studied mainly in the mouse, leads to cardiovascular, metabolic and behavioural disease in adult offspring, with females more susceptible. We evaluate the sequence of events from diet administration that may lead to adult disease. Emb-LPD changes maternal serum and/or uterine fluid metabolite composition, notably with reduced insulin and branched-chain amino acids. This is sensed by blastocysts through reduced mammalian target of rapamycin complex 1 signalling. Embryos respond by permanently changing the pattern of development of their extra-embryonic lineages, trophectoderm and primitive endoderm, to enhance maternal nutrient retrieval during subsequent gestation. These compensatory changes include stimulation of proliferation, endocytosis and cellular motility, and the epigenetic mechanisms underlying them are being identified. Collectively, these responses act to protect fetal growth and likely contribute to offspring competitive fitness. However, the resulting growth adversely affects long-term health because perinatal weight positively correlates with adult disease risk. We argue that periconceptional environmental responses reflect developmental plasticity and 'decisions' made by embryos to optimise their own development, but with lasting consequences.

Relevance:

30.00%

Abstract:

Contemporary models of contrast integration across space assume that pooling operates uniformly over the target region. For sparse stimuli, where high contrast regions are separated by areas containing no signal, this strategy may be sub-optimal because it pools more noise than signal as area increases. Little is known about the behaviour of human observers when detecting such stimuli. We performed an experiment in which three observers detected regular textures of various areas and six levels of sparseness. Stimuli were regular grids of horizontal grating micropatches, each 1 cycle wide. We varied the ratio of signals (marks) to gaps (spaces), with mark:space ratios ranging from 1:0 (a dense texture with no spaces) to 1:24. To compensate for the decline in sensitivity with increasing distance from fixation, we adjusted the stimulus contrast as a function of eccentricity based on previous measurements (Baldwin, Meese & Baker, 2012, J Vis, 12(11):23). We used the resulting area summation functions and psychometric slopes to test several filter-based models of signal combination. A MAX model failed to predict the thresholds, but did a good job on the slopes. Blanket summation of stimulus energy improved the threshold fit, but did not predict an observed slope increase with mark:space ratio. Our best model used a template matched to the sparseness of the stimulus and pooled the squared contrast signal over space. Templates for regular patterns have also recently been proposed to explain the regular appearance of slightly irregular textures (Morgan et al., 2012, Proc R Soc B, 279, 2754–2760).
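As an informal illustration of the three decision rules compared above (not the authors' model code), the sketch below simulates detection of a sparse set of signal locations in noise and compares a MAX rule, blanket energy summation, and energy pooled through a template matched to the mark positions; the mark:space ratio, contrast and noise level are arbitrary choices.

```python
# Illustrative sketch: three simple decision rules for detecting a sparse texture
# in noise -- the MAX of local responses, blanket summation of response energy
# over the whole region, and energy pooled through a template matched to the
# sparseness (mark positions) of the stimulus. All parameter values are assumed.
import numpy as np

rng = np.random.default_rng(1)
n_locations = 25           # grid of micropatch locations
mark_space = 5             # one signal location per five (mark:space 1:4, assumed)
signal_contrast = 1.0
noise_sd = 1.0

template = np.zeros(n_locations)
template[::mark_space] = 1.0           # marks carry signal, spaces carry none

def responses(signal_present):
    signal = signal_contrast * template if signal_present else 0.0
    return signal + rng.normal(0.0, noise_sd, n_locations)

def decision_variables(r):
    max_rule = np.max(r)                      # MAX model
    blanket = np.sum(r ** 2)                  # blanket energy summation
    matched = np.sum((template * r) ** 2)     # sparseness-matched template energy
    return max_rule, blanket, matched

# Crude sensitivity comparison over many simulated trials.
trials = 5000
present = np.array([decision_variables(responses(True)) for _ in range(trials)])
absent = np.array([decision_variables(responses(False)) for _ in range(trials)])
dprime = (present.mean(0) - absent.mean(0)) / np.sqrt(
    0.5 * (present.var(0) + absent.var(0)))
for name, d in zip(["MAX", "blanket energy", "matched template"], dprime):
    print(f"{name:17s} d' = {d:.2f}")
```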