985 results for Usefulness


Relevance: 10.00%

Abstract:

The usefulness of any simulation of atmospheric tracers using low-resolution winds relies on both the dominance of large spatial scales in the strain and time dependence that results in a cascade in tracer scales. Here, a quantitative study on the accuracy of such tracer studies is made using the contour advection technique. It is shown that, although contour stretching rates are very insensitive to the spatial truncation of the wind field, the displacement errors in filament position are sensitive. Knowledge of displacement characteristics is essential if Lagrangian simulations are to be used for the inference of airmass origin. A quantitative lower estimate is obtained for the tracer scale factor (TSF): the ratio of the smallest resolved scale in the advecting wind field to the smallest “trustworthy” scale in the tracer field. For a baroclinic wave life cycle the TSF = 6.1 ± 0.3, while for the Northern Hemisphere wintertime lower stratosphere the TSF = 5.5 ± 0.5, when using the most stringent definition of the trustworthy scale. The similarity in the TSF for the two flows is striking, and an explanation is discussed in terms of the activity of potential vorticity (PV) filaments. Uncertainty in contour initialization is investigated for the stratospheric case. The effect of smoothing initial contours is to introduce a spinup time (2–3 days), after which wind field truncation errors take over from initialization errors. It is also shown that false detail from the proliferation of finescale filaments limits the useful lifetime of such contour advection simulations to 3σ⁻¹ days, where σ is the filament thinning rate, unless filaments narrower than the trustworthy scale are removed by contour surgery. In addition, PV analysis error and diabatic effects are so strong that only PV filaments wider than 50 km are at all believable, even for very high-resolution winds.
The minimum wind field resolution required to accurately simulate filaments down to the erosion scale in the stratosphere (given an initial contour) is estimated and the implications for the modeling of atmospheric chemistry are briefly discussed.
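The 3σ⁻¹-day lifetime quoted in the abstract can be made concrete with a toy contour advection experiment. The sketch below is not the authors' code: the strain flow, time step and initial contour are invented for illustration. It advects a material contour with fourth-order Runge–Kutta, estimates the mean stretching rate σ from the growth of contour length, and derives the corresponding useful lifetime.

```python
import math

def advect_contour(points, velocity, dt, steps):
    """Advect contour points through a velocity field with fourth-order Runge-Kutta."""
    for _ in range(steps):
        new_pts = []
        for (x, y) in points:
            k1 = velocity(x, y)
            k2 = velocity(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1])
            k3 = velocity(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1])
            k4 = velocity(x + dt * k3[0], y + dt * k3[1])
            new_pts.append((x + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6,
                            y + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6))
        points = new_pts
    return points

def contour_length(points):
    return sum(math.hypot(points[i + 1][0] - points[i][0],
                          points[i + 1][1] - points[i][1])
               for i in range(len(points) - 1))

# Pure strain flow u = (s*x, -s*y): material lines stretch exponentially, so the
# mean stretching rate recovered below approaches s for long integrations.
s = 0.5  # strain rate, per day (assumed, for illustration)
flow = lambda x, y: (s * x, -s * y)

circle = [(math.cos(t * 2 * math.pi / 200), math.sin(t * 2 * math.pi / 200))
          for t in range(201)]
L0 = contour_length(circle)
T = 4.0  # days of integration
final = advect_contour(circle, flow, dt=0.01, steps=400)
sigma = math.log(contour_length(final) / L0) / T   # mean stretching rate (per day)
useful_lifetime = 3.0 / sigma                      # the 3/sigma limit from the abstract
```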

Relevance: 10.00%

Abstract:

Current mathematical models in building research have been limited in most studies to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that a chaos-based model is valid and can handle the increasing complexity of building systems, which have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology for linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I) reviews the current state of chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos-theory-driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and the most relevant characteristics of chaos theory essential to building simulation scientists, initiates a dialogue and builds bridges between scientists and engineers, and stimulates future research about a wide range of issues on building environmental systems.
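As a minimal illustration of the chaotic dynamics the review appeals to (a textbook example, not drawn from the paper), the logistic map shows sensitive dependence on initial conditions: two trajectories separated by 10⁻⁹ diverge to a macroscopic gap within a few dozen iterations.

```python
def logistic_orbit(x0, r=4.0, n=40):
    """Iterate the logistic map x -> r*x*(1-x), a textbook chaotic system."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories starting 1e-9 apart: at r = 4 the perturbation grows roughly
# as e^(lambda*n) with Lyapunov exponent lambda = ln 2, so it saturates quickly.
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)
max_gap = max(abs(x - y) for x, y in zip(a, b))
```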

Relevance: 10.00%

Abstract:

Current mathematical models in building research have been limited in most studies to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that a chaos-based model is valid and can handle the increasing complexity of building systems, which have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology for linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I), published in the previous issue, reviews the current state of chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos-theory-driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and the most relevant characteristics of chaos theory essential to (1) building simulation scientists and designers, (2) initiating a dialogue between scientists and engineers, and (3) stimulating future research on a wide range of issues involved in designing and managing building environmental systems.

Relevance: 10.00%

Abstract:

We are developing computational tools supporting the detailed analysis of the dependence of neural electrophysiological response on dendritic morphology. We approach this problem by combining simulations of faithful models of neurons (experimental real-life morphological data with known models of channel kinetics) with algorithmic extraction of morphological and physiological parameters and statistical analysis. In this paper, we present a novel method for the automatic recognition of spike trains in voltage traces, which eliminates the need for human intervention. This enables classification of waveforms with consistent criteria across all the analyzed traces and so amounts to a reduction of noise in the data. This method allows for the automatic extraction of relevant physiological parameters necessary for further statistical analysis. In order to illustrate the usefulness of this procedure for analyzing voltage traces, we characterized the influence of the somatic current injection level on several electrophysiological parameters in a set of modeled neurons. This application suggests that such algorithmic processing of physiological data extracts parameters in a form suitable for further investigation of the structure–activity relationship in single neurons.
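A minimal sketch of this kind of automatic spike-train recognition (illustrative only: the threshold rule, sampling step and synthetic trace are assumptions, not the authors' algorithm). Applying one fixed criterion to every trace is what makes the extracted parameters consistent across recordings.

```python
def detect_spikes(trace, dt, threshold=0.0):
    """Return spike times (ms): upward crossings of a fixed voltage threshold."""
    times = []
    for i in range(1, len(trace)):
        if trace[i - 1] < threshold <= trace[i]:
            times.append(i * dt)
    return times

def mean_firing_rate(spike_times, duration):
    """Spikes per unit time over the whole recording."""
    return len(spike_times) / duration

# Synthetic trace: resting at -65 mV with three brief depolarisations above 0 mV.
dt = 0.1  # ms per sample (assumed)
trace = [-65.0] * 1000
for start in (100, 400, 700):           # spike onsets (sample indices)
    for j in range(start, start + 10):  # 1 ms suprathreshold excursion
        trace[j] = 30.0

spikes = detect_spikes(trace, dt)
rate = mean_firing_rate(spikes, duration=len(trace) * dt)  # spikes per ms
```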

Relevance: 10.00%

Abstract:

We examine whether a three-regime model that allows for dormant, explosive and collapsing speculative behaviour can explain the dynamics of the S&P 500. We extend existing models of speculative behaviour by including a third regime that allows a bubble to grow at a steady rate, and propose abnormal volume as an indicator of the probable time of bubble collapse. We also examine the financial usefulness of the three-regime model by studying a trading rule formed using inferences from it, whose use leads to higher Sharpe ratios and end-of-period wealth than employing existing models or a buy-and-hold strategy.
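The trading-rule comparison rests on the Sharpe ratio. A hedged sketch of the idea, switching to cash when the model's inferred collapse probability is high (the returns and regime probabilities below are toy numbers, not the paper's estimates):

```python
import math

def sharpe_ratio(returns, periods_per_year=252):
    """Annualised Sharpe ratio of a series of per-period returns (risk-free rate 0)."""
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
    return mean / math.sqrt(var) * math.sqrt(periods_per_year)

def rule_returns(market_returns, collapse_prob, cutoff=0.5):
    """Hold the index unless the inferred probability of the 'collapsing'
    regime exceeds the cutoff; then stay in cash (zero return)."""
    return [0.0 if p > cutoff else r
            for r, p in zip(market_returns, collapse_prob)]

# Toy data: steady gains followed by a crash the model flags in advance.
market = [0.01] * 10 + [-0.05] * 5
probs = [0.1] * 10 + [0.9] * 5   # hypothetical regime probabilities
passive = sharpe_ratio(market)
active = sharpe_ratio(rule_returns(market, probs))
```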

Relevance: 10.00%

Abstract:

This paper provides a new set of theoretical perspectives on the topic of value management in building procurement. On the evidence of the current literature it is possible to identify two distinct methodologies which are based on different epistemological positions. An argument is developed which sees these two methodologies as complementary. A tentative meta-methodology is then outlined for matching methodologies to different problem situations. It is contended, however, that such a meta-methodology could never provide a prescriptive guide. Its usefulness lies in the way in which it provides the basis for reflective practice. Of central importance is the need to understand the problem context within which value management is to be applied. The distinctions between unitary, pluralistic and coercive situations are seen to be especially significant.

Relevance: 10.00%

Abstract:

Identifying a periodic time-series model from environmental records, without imposing the positivity of the growth rate, does not necessarily respect the time order of the data observations. Consequently, subsequent observations, sampled in the environmental archive, can be inverted on the time axis, resulting in a non-physical signal model. In this paper an optimization technique with linear constraints on the signal model parameters is proposed that prevents time inversions. The activation conditions for this constrained optimization are based upon the physical constraint on the growth rate, namely, that it cannot take values smaller than zero. The actual constraints are defined for polynomials and first-order splines as basis functions for the nonlinear contribution in the distance–time relationship. The method is compared with an existing method that eliminates the time inversions, and its noise sensitivity is tested by means of Monte Carlo simulations. Finally, the usefulness of the method is demonstrated on measurements of vessel density in a mangrove tree, Rhizophora mucronata, and of Mg/Ca ratios in a bivalve, Mytilus trossulus.
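The physical constraint here is that the growth rate may not be negative, i.e. the fitted time axis must be non-decreasing along the growth direction. A minimal sketch of detecting and repairing time inversions by projecting onto non-decreasing sequences (the authors instead impose linear constraints during the fit itself; the fitted ages below are invented):

```python
def has_inversion(times):
    """True if any later sample is assigned an earlier time (negative growth rate)."""
    return any(b < a for a, b in zip(times, times[1:]))

def enforce_monotonic(times):
    """Project a fitted time axis onto physically admissible (non-decreasing)
    values by carrying forward the running maximum."""
    fixed = [times[0]]
    for t in times[1:]:
        fixed.append(max(t, fixed[-1]))
    return fixed

# A periodic model fitted without constraints can assign decreasing dates to
# successive samples along the growth axis (a time inversion):
fitted = [0.0, 1.1, 2.3, 2.1, 2.8, 4.0]   # hypothetical fitted ages (years)
repaired = enforce_monotonic(fitted)
```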

Relevance: 10.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

Relevance: 10.00%

Abstract:

This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets.
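The beta-generated construction underlying the GBG class admits a simple sampling sketch: if U ~ Beta(a, b), then X = G⁻¹(U) has CDF I_{G(x)}(a, b). The code below covers only the classical beta-generated special case with an exponential parent (the GBG class of the paper adds a further shape parameter, not implemented here); the parameters are chosen for illustration.

```python
import math
import random

def beta_generated_sample(n, a, b, G_inv, seed=0):
    """Draw from the beta-G distribution: if U ~ Beta(a, b), then X = G_inv(U)
    has CDF I_{G(x)}(a, b). This is the classical beta-generated construction;
    the paper's GBG class generalises it with an extra shape parameter."""
    rng = random.Random(seed)
    return [G_inv(rng.betavariate(a, b)) for _ in range(n)]

# Parent G: exponential with rate lam, so G_inv(u) = -ln(1-u)/lam.
lam = 2.0
G_inv = lambda u: -math.log(max(1.0 - u, 1e-300)) / lam  # guard against u == 1

# a = b = 1 makes U uniform, recovering the parent exactly, so the sample mean
# should sit near the exponential mean 1/lam = 0.5.
xs = beta_generated_sample(20000, 1.0, 1.0, G_inv)
mean = sum(xs) / len(xs)
```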

Relevance: 10.00%

Abstract:

Background: Currently, all pharmacists and technicians registered with the Royal Pharmaceutical Society of Great Britain must complete a minimum of nine Continuing Professional Development (CPD) records (entries) each year. From September 2010 a new regulatory body, the General Pharmaceutical Council, will oversee the regulation (including revalidation) of all pharmacy registrants in Great Britain. CPD may provide part of the supporting evidence that a practitioner submits to the regulator as part of the revalidation process. Gaps in knowledge necessitated further research to examine the usefulness of CPD in pharmacy revalidation. Project aims: The overall aims of this project were to summarise pharmacy professionals’ past involvement in CPD, examine the usability of current CPD entries for the purpose of revalidation, and examine the impact of ‘revalidation standards’ and a bespoke Outcomes Framework on the conduct and construction of CPD entries for the future revalidation of pharmacy professionals. We completed a comprehensive review of the literature; devised, validated and tested the impact of a new CPD Outcomes Framework and related training material in an empirical investigation involving volunteer pharmacy professionals; and also spoke with our participants to bring meaning and understanding to the process of CPD conduct and recording and to gain feedback on the study itself. Key findings: The comprehensive literature review identified perceived barriers to CPD and resulted in recommendations that could potentially rectify pharmacy professionals’ perceptions and facilitate participation in CPD. The CPD Outcomes Framework can be used to score CPD entries. Compared to a control (CPD and ‘revalidation standards’ only), we found that training participants to apply the CPD Outcomes Framework resulted in entries that scored significantly higher in the context of a quantitative method of CPD assessment.
Feedback from participants who had received the CPD Outcomes Framework was positive, and a number of useful suggestions were made about improvements to the Framework and related training. Entries scored higher because participants had consciously applied concepts linked to the CPD Outcomes Framework, whereas entries scored low where participants had been unable to apply the concepts of the Framework for a variety of reasons, including limitations posed by the ‘Plan & Record’ template. Feedback about the nature of the ‘revalidation standards’ and their application to CPD was not positive, and participants had not in the main sought to apply the standards to their CPD entries – but those in the intervention group were more likely to have referred to the revalidation standards for their CPD. As assessors, we too found the process of selecting and assigning ‘revalidation standards’ to individual CPD entries burdensome and somewhat unspecific. We believe that addressing the perceived barriers and drawing on the facilitators will help deal with the apparent lack of engagement with the revalidation standards, and we have been able to make a set of relevant recommendations. We devised a model to explain and tell the story of CPD behaviour. Based on the concepts of purpose, action and results, the model centres on explaining two types of CPD behaviour, one following the traditional CE pathway and the other a more genuine CPD pathway. Entries which scored higher when we applied the CPD Outcomes Framework were more likely to follow the CPD pathway in the model. Significant to our findings is that while participants following both models of practice took part in this study, the CPD Outcomes Framework was able to change people’s CPD behaviour to make it more in line with the CPD pathway.
The CPD Outcomes Framework in defining the CPD criteria, the training pack in teaching the basis and use of the Framework, and the process of assessment in using the CPD Outcomes Framework would have interacted to improve participants’ CPD through a collective process. Participants were keen to receive a curriculum against which CE-type activities, in particular, could be conducted; another important observation relates to whether CE has any role to play in pharmacy professionals’ revalidation. We would recommend that the CPD Outcomes Framework be used in the revalidation of pharmacy professionals in the future, provided the requirement to submit nine CPD entries per annum is re-examined and expressed more clearly in relation to what specifically participants are being asked to submit – i.e. the ratio of CE to CPD entries. We can foresee a benefit in setting more regular intervals which would act as deadlines for CPD submission in the future. On the whole, there is value in using CPD for the purpose of pharmacy professionals’ revalidation in the future.

Relevance: 10.00%

Abstract:

In order to identify the factors influencing adoption of technologies promoted by government to small-scale dairy farmers in the highlands of central Mexico, a field survey was conducted. A total of 115 farmers were grouped through cluster analysis (CA) and divided into three wealth status categories (high, medium and low) using wealth ranking. Chi-square analysis was used to examine the association of wealth status with technology adoption. Four groups of farms were differentiated in terms of farm dimensions, farmers’ education, sources of income, wealth status, herd management, monetary support by government and technological availability. Statistical differences (p < 0.05) were observed in the milk yield per herd per year among groups. Government organizations (GO) participated little in the promotion of the 17 technologies identified, six of which focused on crop or forage production and 11 of which were related to animal husbandry. Relatives and other farmers played an important role in knowledge diffusion and technology adoption. Although wealth status had a significant association (p < 0.05) with adoption, other factors, including the importance of the technology to farmers, the usefulness and productive benefits of innovations, and farmers’ knowledge of them, were important. It is concluded that the analysis of the information per group and wealth status was useful to identify suitable crop- or forage-related and animal husbandry technologies per group and wealth status of farmers. Therefore, the characterization of farmers could provide a useful starting point for the design and delivery of more appropriate and effective extension.
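The chi-square association test used in the survey can be sketched as follows; the contingency table below is hypothetical, not the study's data.

```python
def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for an r x c
    contingency table of observed counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / total
            stat += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts: wealth status (high/medium/low) x adoption (yes/no).
table = [[30, 10],
         [20, 15],
         [10, 30]]
stat, df = chi_square(table)
# Compare with the 5% critical value 5.991 for 2 degrees of freedom.
significant = stat > 5.991
```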

Relevance: 10.00%

Abstract:

International competitiveness ultimately depends upon the linkages between a firm’s unique, idiosyncratic capabilities (firm-specific advantages, FSAs) and its home country assets (country-specific advantages, CSAs). In this paper, we present a modified FSA/CSA matrix building upon the FSA/CSA matrix (Rugman 1981). We relate this to the diamond framework for national competitiveness (Porter 1990), and the double diamond model (Rugman and D’Cruz 1993). We provide empirical evidence to demonstrate the merits and usefulness of the modified FSA/CSA matrix using the Fortune Global 500 firms. We examine the FSAs based on the geographic scope of sales and CSAs that can lead to national, home region, and global competitiveness. Our empirical analysis suggests that the world’s largest 500 firms have increased their firm-level international competitiveness. However, much of this is still being achieved within their home region. In other words, international competitiveness is a regional not a global phenomenon. Our findings have significant implications for research and practice. Future research in international marketing should take into account the multi-faceted nature of FSAs and CSAs across different levels. For MNE managers, our study provides useful insights for strategic marketing planning and implementation.

Relevance: 10.00%

Abstract:

If secondary structure predictions are to be incorporated into fold recognition methods, an assessment of the effect of specific types of errors in predicted secondary structures on the sensitivity of fold recognition should be carried out. Here, we present a systematic comparison of different secondary structure prediction methods by measuring frequencies of specific types of error. We carry out an evaluation of the effect of specific types of error on secondary structure element alignment (SSEA), a baseline fold recognition method. The results of this evaluation indicate that missing out whole helix or strand elements, or predicting the wrong type of element, is more detrimental than predicting the wrong lengths of elements or overpredicting helix or strand. We also suggest that SSEA scoring is an effective method for assessing accuracy of secondary structure prediction and perhaps may also provide a more appropriate assessment of the “usefulness” and quality of predicted secondary structure, if secondary structure alignments are to be used in fold recognition.
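A simplified stand-in for SSEA-style scoring (illustrative only; the real SSEA method aligns elements with dynamic programming) that reproduces the abstract's point: predicting the wrong element type is penalised, while merely mispredicting element lengths is not. The secondary structure strings are invented.

```python
def elements(ss):
    """Collapse a per-residue secondary structure string (H/E/C) into a list
    of (type, length) elements, ignoring coil."""
    elems, i = [], 0
    while i < len(ss):
        j = i
        while j < len(ss) and ss[j] == ss[i]:
            j += 1
        if ss[i] in "HE":
            elems.append((ss[i], j - i))
        i = j
    return elems

def element_score(pred, obs):
    """Crude element-level agreement: fraction of observed elements whose type
    matches the prediction at the same element index."""
    p, o = elements(pred), elements(obs)
    matches = sum(1 for a, b in zip(p, o) if a[0] == b[0])
    return matches / max(len(o), 1)

observed  = "CCHHHHCCEEEECCHHHC"
wrong_len = "CHHHHHHCEEECCCHHHC"   # element lengths wrong, types right
wrong_typ = "CCEEEECCEEEECCHHHC"   # first helix predicted as strand
```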

Relevance: 10.00%

Abstract:

The elucidation of the domain content of a given protein sequence in the absence of determined structure or significant sequence homology to known domains is an important problem in structural biology. Here we address how successfully the delineation of continuous domains can be accomplished in the absence of sequence homology using simple baseline methods, an existing prediction algorithm (Domain Guess by Size), and a newly developed method (DomSSEA). The study was undertaken with a view to measuring the usefulness of these prediction methods in terms of their application to fully automatic domain assignment. Thus, the sensitivity of each domain assignment method was measured by calculating the number of correctly assigned top-scoring predictions. We have implemented a new continuous domain identification method using the alignment of predicted secondary structures of target sequences against observed secondary structures of chains with known domain boundaries as assigned by Class Architecture Topology Homology (CATH). Taking top predictions only, the success rate of the method in correctly assigning domain number to the representative chain set is 73.3%. The top prediction for domain number and location of domain boundaries was correct for 24% of the multidomain set (±20 residues). These results have been put into context in relation to the results obtained from the other prediction methods assessed.

Relevance: 10.00%

Abstract:

The estimation of prediction quality is important because without quality measures, it is difficult to determine the usefulness of a prediction. Currently, methods for ligand binding site residue predictions are assessed in the function prediction category of the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) experiment, utilizing the Matthews Correlation Coefficient (MCC) and Binding-site Distance Test (BDT) metrics. However, the assessment of ligand binding site predictions using such metrics requires the availability of solved structures with bound ligands. Thus, we have developed a ligand binding site quality assessment tool, FunFOLDQA, which utilizes protein feature analysis to predict ligand binding site quality prior to the experimental solution of the protein structures and their ligand interactions. The FunFOLDQA feature scores were combined using simple linear combinations, multiple linear regression and a neural network. The neural network produced significantly better results for correlations to both the MCC and BDT scores, according to Kendall’s τ, Spearman’s ρ and Pearson’s r correlation coefficients, when tested on both the CASP8 and CASP9 datasets. The neural network also produced the largest Area Under the Curve (AUC) score when Receiver Operating Characteristic (ROC) analysis was undertaken for the CASP8 dataset. Furthermore, the FunFOLDQA algorithm, incorporating the neural network, is shown to add value to FunFOLD when both methods are employed in combination. This results in a statistically significant improvement over all of the best server methods, the FunFOLD method (6.43%), and one of the top manual groups (FN293) tested on the CASP8 dataset. The FunFOLDQA method was also found to be competitive with the top server methods when tested on the CASP9 dataset.
To the best of our knowledge, FunFOLDQA is the first attempt to develop a method that can be used to assess ligand binding site prediction quality, in the absence of experimental data.
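The simplest of the three combiners, multiple linear regression of feature scores onto the quality target, can be sketched as below. The feature values and targets are hypothetical; the actual FunFOLDQA features and trained weights are not reproduced here.

```python
def fit_linear(X, y):
    """Least-squares weights for y ~ X.w + b via the normal equations,
    solved with Gaussian elimination (no external libraries)."""
    A = [row + [1.0] for row in X]          # append a bias column
    n = len(A[0])
    # Normal equations: (A^T A) w = A^T y
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
           for i in range(n)]
    Aty = [sum(A[k][i] * y[k] for k in range(len(A))) for i in range(n)]
    # Forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Aty[col], Aty[piv] = Aty[piv], Aty[col]
        for r in range(col + 1, n):
            f = AtA[r][col] / AtA[col][col]
            for c in range(col, n):
                AtA[r][c] -= f * AtA[col][c]
            Aty[r] -= f * Aty[col]
    # Back substitution
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (Aty[r] - sum(AtA[r][c] * w[c] for c in range(r + 1, n))) / AtA[r][r]
    return w  # last entry is the bias b

def predict(w, feats):
    return sum(wi * f for wi, f in zip(w, feats)) + w[-1]

# Hypothetical feature scores for predicted binding sites vs. an MCC-like target
# (constructed so that y = 0.5*f1 + 0.5*f2 exactly).
X = [[0.2, 0.1], [0.4, 0.3], [0.6, 0.5], [0.8, 0.9]]
y = [0.15, 0.35, 0.55, 0.85]
w = fit_linear(X, y)
```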