998 results for Concept vector


Relevance:

20.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to explore the concept of service quality for settings where several customers are involved in the joint creation and consumption of a service. The approach is to provide first insights into the implications of simultaneous multi‐customer integration for service quality.

Design/methodology/approach – This conceptual paper undertakes a thorough review of the relevant literature before developing a conceptual model of service co‐creation and service quality in customer groups.

Findings – Group service encounters must be set up carefully to account for the dynamics (social activity) within a customer group and for the skill sets and capabilities (task activity) of each of the individual participants in a group service experience.

Research limitations/implications – Future research should undertake empirical studies to validate and/or modify the model suggested in this contribution.

Practical implications – Managers of service firms should be made aware of the implications and underlying factors of group services in order to create and manage a group experience successfully. Particular attention should be given to those factors that service providers can influence when managing encounters with multiple customers.

Originality/value – This article introduces a new conceptual approach to service encounters with groups of customers in a proposed service quality model. In particular, the paper focuses on integrating the impact of customers' co‐creation activities on service quality in a multiple‐actor model.

Relevance:

20.00%

Publisher:

Abstract:

Purpose – The purpose of this research is to examine the concept of "potential quality" – that is, a company's tangible search qualities (such as the physical servicescape and virtual servicescape) – within the context of the real‐estate industry in the USA.

Design/methodology/approach – This qualitative study collects data through personal in‐depth interviews with 34 respondents who had recently bought or rented property. The data are then coded and themed to identify quality dimensions relevant to this industry.

Findings – The results indicate that a buyer's perception of the overall service quality of real‐estate service consists of two components: the interaction with a realtor (process quality) and the virtual servicescape, especially the firm's website design and content (potential quality). The study concludes that existing scales (such as SERVQUAL and RESERV) fail to capture the tangible component of service quality sufficiently in the real‐estate industry.

Research limitations/implications – The study uses data from only one industry (real estate) and only one demographic segment (professionals in higher education).

Practical implications – Providers of intangible, high‐contact services must appreciate the importance of the virtual servicescape as a surrogate quality indicator that can reduce information asymmetries and consumers' uncertainty about initiating a business relationship. Real‐estate firms need to pay attention to the training of agents and to the design and content of their e‐service systems.

Originality/value – This study integrates potential quality, process quality, and outcome quality in a comprehensive proposed model. In particular, the study identifies "potential quality" as a combination of the attributes of the virtual and physical service environments.

Relevance:

20.00%

Publisher:

Abstract:

Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive a stochastic fluvial process model on the basis of the convective Exner equation that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and position of the river channel. In order to couple the velocity statistics and the fluvial process model, the perturbation method is employed with a non-stationary spectral approach to decompose the Exner equation into two separate equations: the first is the mean equation, which yields the mean sediment thickness, and the second is the perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers formed by fluvial processes, allowing the stochasticity of the paleoflow velocity to be incorporated.
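The abstract does not reproduce the governing equations; as a hedged sketch, the classical sediment-continuity (Exner) equation and the mean/fluctuation split that the perturbation method applies to it take the form

$$(1-\lambda_p)\,\frac{\partial \eta}{\partial t} = -\frac{\partial q_s(u)}{\partial x}, \qquad u = \bar{u} + u', \qquad \eta = \bar{\eta} + \eta',$$

where $\eta$ is sediment thickness (bed elevation), $\lambda_p$ is bed porosity, $q_s(u)$ is the sediment flux driven by the flow velocity $u$, and overbars and primes denote means and fluctuations. Averaging the expanded equation yields the mean equation for $\bar{\eta}$; subtracting the mean equation leaves the perturbation equation, whose second moments give the variance of $\eta$. The paper's convective form and spectral closure may differ in detail from this textbook version.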

Relevance:

20.00%

Publisher:

Abstract:

The window of opportunity is a concept critical to rheumatoid arthritis treatment. Early treatment changes the outcome of rheumatoid arthritis: response rates are higher with earlier disease-modifying anti-rheumatic drug treatment, and damage is substantially reduced. Axial spondyloarthritis is an inflammatory axial disease encompassing both nonradiographic axial spondyloarthritis and established ankylosing spondylitis. In axial spondyloarthritis, studies of magnetic resonance imaging, as well as tumor necrosis factor inhibitor treatment and withdrawal studies, all suggest that early effective suppression of inflammation has the potential to reduce radiographic damage. This suggests that the concept of a window of opportunity is relevant not only to rheumatoid arthritis but also to axial spondyloarthritis. The challenge now is to identify high-risk patients early and to commence treatment without delay. Developments in risk stratification include new classification criteria, identification of clinical risk factors, biomarkers, genetic associations, potential antibody associations and an ankylosing spondylitis-specific microbiome signature. Further research needs to focus on the evidence for early intervention and the early identification of high-risk individuals.

Relevance:

20.00%

Publisher:

Abstract:

The concept of energy gap(s) is useful for understanding the consequences of a small daily, weekly, or monthly positive energy balance and the inconspicuous shift in weight gain ultimately leading to overweight and obesity. The energy gap is a dynamic concept: an initial positive energy gap incurred via an increase in energy intake (or a decrease in physical activity) is not constant, may fade out with time if the initial conditions are maintained, and depends on the 'efficiency' with which the energy imbalance gap is readjusted over time. The metabolic response to an energy imbalance gap and the magnitude of the energy gap(s) can be estimated by at least two methods: i) assessment by longitudinal overfeeding studies, imposing (by design) an initial positive energy imbalance gap; ii) retrospective assessment based on epidemiological surveys, whereby the accumulated endogenous energy storage per unit of time is calculated from the change in body weight and body composition. To illustrate the difficulty of accurately assessing an energy gap, we have used a recent epidemiological study which tracked changes in total energy intake (estimated by gross food availability) and body weight over three decades in the US, combined with total energy expenditure predicted from body weight using doubly labelled water data. At the population level, the study attempted to assess the cause of the energy gap, which it attributed entirely to increased food intake. Based on an estimate of the change in energy intake judged to be more reliable (i.e. from the same study population), together with calculations of simple energetic indices, our analysis suggests that conclusions about the fundamental causes of obesity development in a population (excess intake, low physical activity, or both) are clouded by a high level of uncertainty.
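As a sense of scale for the retrospective method, here is a minimal sketch of the arithmetic, assuming the commonly cited figure of roughly 7700 kcal stored per kilogram of weight gained (the study's actual energetic indices are not given in the abstract):

```python
# Hypothetical back-of-envelope estimate of the average daily "energy gap"
# implied by an observed weight change, using an assumed energy density of
# body tissue (~7700 kcal per kg gained). Not the study's own calculation.

ENERGY_DENSITY_KCAL_PER_KG = 7700  # assumed energy content of weight gain

def mean_daily_energy_gap(weight_change_kg: float, days: int) -> float:
    """Average daily energy surplus (kcal/day) implied by a weight change."""
    return weight_change_kg * ENERGY_DENSITY_KCAL_PER_KG / days

# Example: 10 kg gained over 30 years implies a very small daily surplus.
print(f"{mean_daily_energy_gap(10, 30 * 365):.1f} kcal/day")  # ~7 kcal/day
```

The tiny daily figure illustrates why population-level energy gaps are so hard to resolve against the uncertainty of intake and expenditure estimates.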

Relevance:

20.00%

Publisher:

Abstract:

Background: In 2011, a variant of West Nile virus Kunjin strain (WNVKUN) caused an unprecedented epidemic of neurological disease in horses in southeast Australia, resulting in almost 1,000 cases and a 9% fatality rate. We investigated whether increased fitness of the virus in the primary vector, Culex annulirostris, and another potential vector, Culex australicus, contributed to the widespread nature of the outbreak.

Methods: Mosquitoes were exposed to infectious blood meals containing either the virus strain responsible for the outbreak, designated WNVKUN2011, or WNVKUN2009, a strain of low virulence that is typical of historical strains of this virus. WNVKUN infection in mosquito samples was detected using a fixed cell culture enzyme immunoassay and a WNVKUN-specific monoclonal antibody. Probit analysis was used to determine mosquito susceptibility to infection. Infection, dissemination and transmission rates for selected days post-exposure were compared using Fisher's exact test. Virus titers in bodies and saliva expectorates were compared using t-tests.

Results: There were few significant differences between the two virus strains in the susceptibility of Cx. annulirostris to infection, the kinetics of virus replication and the ability of this mosquito species to transmit either strain. Both strains were first transmitted by Cx. annulirostris on day 5 post-exposure. The highest transmission rates (proportion of mosquitoes with virus detected in saliva) observed were 68% for WNVKUN2011 on day 12 and 72% for WNVKUN2009 on day 14. On days 12 and 14 post-exposure, significantly more WNVKUN2011 than WNVKUN2009 was expectorated by infected mosquitoes. Infection, dissemination and transmission rates of the two strains were not significantly different in Culex australicus. However, transmission rates and the amount of virus expectorated were significantly lower in Cx. australicus than in Cx. annulirostris.

Conclusions: The higher amount of WNVKUN2011 expectorated by infected mosquitoes may indicate that this virus strain is transmitted more efficiently by Cx. annulirostris than other WNVKUN strains. Combined with other factors, such as a convergence of abundant mosquito and wading bird populations, and mammalian and avian feeding behaviour by Cx. annulirostris, this may have contributed to the scale of the 2011 equine epidemic.
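For readers unfamiliar with the test named in the Methods, a minimal sketch of comparing transmission rates between the two strains with Fisher's exact test; the 2x2 counts below are hypothetical, since the abstract reports percentages but not denominators:

```python
# Illustrative Fisher's exact test on hypothetical transmission counts for
# a given day post-exposure (the paper's raw numbers are not in the abstract).
from scipy.stats import fisher_exact

#        saliva-positive, saliva-negative
table = [[17, 8],   # e.g. WNVKUN2011: 17/25 mosquitoes transmitting (68%)
         [18, 7]]   # e.g. WNVKUN2009: 18/25 mosquitoes transmitting (72%)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```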

Relevance:

20.00%

Publisher:

Abstract:

Concept mapping involves determining relevant concepts from a free-text input, where concepts are defined in an external reference ontology. This is an important process that underpins many applications for clinical information reporting, derivation of phenotypic descriptions, and a number of state-of-the-art medical information retrieval methods. Concept mapping can be cast into an information retrieval (IR) problem: free-text mentions are treated as queries and concepts from a reference ontology as the documents to be indexed and retrieved. This paper presents an empirical investigation applying general-purpose IR techniques for concept mapping in the medical domain. A dataset used for evaluating medical information extraction is adapted to measure the effectiveness of the considered IR approaches. Standard IR approaches used here are contrasted with the effectiveness of two established benchmark methods specifically developed for medical concept mapping. The empirical findings show that the IR approaches are comparable with one benchmark method but well below the best benchmark.
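A minimal sketch of the IR casting described above, with TF-IDF cosine similarity standing in for the general-purpose IR models evaluated; the ontology entries and the mention are hypothetical placeholders (a real reference ontology would be, e.g., SNOMED CT or UMLS):

```python
# Concept mapping cast as retrieval: concept terms from a reference ontology
# are indexed as "documents" and a free-text mention is issued as a "query".
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

concepts = {                     # hypothetical (id, preferred term) pairs
    "C001": "myocardial infarction",
    "C002": "cerebral infarction",
    "C003": "congestive heart failure",
}
ids, terms = zip(*concepts.items())

vectorizer = TfidfVectorizer()
index = vectorizer.fit_transform(terms)            # index concept terms

mention = "acute myocardial infarction"            # free-text mention (query)
scores = cosine_similarity(vectorizer.transform([mention]), index).ravel()
best = scores.argmax()
print(ids[best], "->", terms[best], f"(score {scores[best]:.2f})")
```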

Relevance:

20.00%

Publisher:

Abstract:

Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. More accurate current software, however, requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for accurate and scalable species identification in DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
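For reference, a minimal sketch of the second-stage metric, the Kimura two-parameter (K2P) distance, computed from the proportions of transitions (P) and transversions (Q) between two aligned sequences; this is the textbook formula, not VIP Barcoding's own implementation:

```python
# K2P distance: d = -1/2 ln(1 - 2P - Q) - 1/4 ln(1 - 2Q), where P and Q are
# the fractions of aligned sites showing transitions and transversions.
import math

PURINES = {"A", "G"}

def k2p_distance(seq1: str, seq2: str) -> float:
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    n = len(pairs)
    transitions = sum(a != b and (a in PURINES) == (b in PURINES)
                      for a, b in pairs)
    transversions = sum(a != b and (a in PURINES) != (b in PURINES)
                        for a, b in pairs)
    P, Q = transitions / n, transversions / n
    return -0.5 * math.log(1 - 2 * P - Q) - 0.25 * math.log(1 - 2 * Q)

print(f"{k2p_distance('ACGTACGTACGT', 'ACGTACGAACGT'):.4f}")  # 1 transversion
```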

Relevance:

20.00%

Publisher:

Abstract:

Background: Although lentiviral vectors have been widely used for in vitro and in vivo gene therapy research, there have been few studies systematically examining the various conditions that may affect the determination of the number of viable vector particles in a vector preparation and the use of multiplicity of infection (MOI) as a parameter for the prediction of gene transfer events.

Methods: Lentiviral vectors encoding a marker gene were packaged and supernatants concentrated. The number of viable vector particles was determined by in vitro transduction followed by fluorescent microscopy and FACS analyses. Various factors that may affect the transduction process, such as vector inoculum volume, target cell number and type, vector decay, and variable vector-target cell contact and adsorption periods, were studied. MOIs between 0 and 32 were assessed on commonly used cell lines as well as a new cell line.

Results: We demonstrated that the resulting values of lentiviral vector titre varied with changes of conditions in the transduction process, including inoculum volume of the vector, the type and number of target cells, vector stability and the length of the period of vector adsorption to target cells. Vector inoculum and the number of target cells determine the frequencies of gene transfer events, although not proportionally. Vector exposure time to target cells also influenced transduction results. Varying these parameters resulted in greater than 50-fold differences in the vector titre from the same vector stock. Commonly used cell lines in vector titration were less sensitive to lentiviral vector-mediated gene transfer than a new cell line, FRL 19. Within the MOI range of 0-32 used to transduce four different cell lines, the higher the MOI applied, the higher the efficiency of gene transfer obtained.

Conclusion: Several variables in the transduction process affected in vitro vector titration and resulted in vastly different values from the same vector stock, thus complicating the use of MOI for predicting gene transfer events. Commonly used target cell lines underestimated vector titre. However, within a certain range of MOI, lentivector-mediated gene transfer events could be predicted if strictly controlled conditions are observed in the vector titration process, including the use of a sensitive cell line, such as FRL 19, for vector titration. © 2004 Zhang et al; licensee BioMed Central Ltd.
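To make the quantities concrete, a minimal sketch of the arithmetic linking a transduction assay to titre and MOI; all values are hypothetical and the simple single-hit assumption below is mine, not the paper's protocol:

```python
# Functional titre inferred from the fraction of marker-positive cells, and
# MOI as transducing units (TU) applied per target cell. As the paper shows,
# the measured titre depends strongly on inoculum volume, cell number/type
# and adsorption time, so these numbers are assay-dependent.

def functional_titre(cells: int, positive_fraction: float,
                     inoculum_ml: float, dilution: float = 1.0) -> float:
    """TU per mL of undiluted stock, assuming one hit per positive cell
    (ignores Poisson multiple-hit corrections at high positive fractions)."""
    return cells * positive_fraction * dilution / inoculum_ml

def moi(titre_tu_per_ml: float, inoculum_ml: float, cells: int) -> float:
    """Multiplicity of infection: TU applied per target cell."""
    return titre_tu_per_ml * inoculum_ml / cells

# Hypothetical assay: 100,000 cells, 20% marker-positive after transduction
# with 0.1 mL of a 1:100 dilution of the stock.
titre = functional_titre(cells=100_000, positive_fraction=0.20,
                         inoculum_ml=0.1, dilution=100)
print(f"titre = {titre:.1e} TU/mL")                                # 2.0e+07
print(f"MOI of 0.1 mL neat stock = {moi(titre, 0.1, 100_000):.0f}")  # 20
```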

Relevance:

20.00%

Publisher:

Abstract:

Patents provide monopoly rights to patent holders. There are safeguards in the patent regime to ensure that the exclusive right of the patent holder is not misused. Compulsory licensing is one of the safeguards provided under TRIPS, under which the patent-granting state may allow a third party to exploit the invention without the patent holder's consent, upon terms and conditions decided by the government. This concept has existed since 1623 and was not introduced by TRIPS for the first time. However, the mechanism has undergone significant changes, especially in the post-TRIPS era. The history of the evolution of compulsory licensing is one of the least explored areas of intellectual property law. This paper undertakes an analysis of the different phases in the evolution of the compulsory licensing mechanism and sheds light on the reasons behind these developments, especially after TRIPS.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, downscaling models are developed using a support vector machine (SVM) for obtaining projections of monthly mean maximum and minimum temperatures (T-max and T-min) at the river-basin scale. The effectiveness of the model is demonstrated through application to downscaling the predictands for the catchment of the Malaprabha reservoir in India, which is considered to be a climatically sensitive region. The probable predictor variables are extracted from (1) the National Centers for Environmental Prediction (NCEP) reanalysis dataset for the period 1978-2000, and (2) the simulations from the third-generation Canadian Coupled Global Climate Model (CGCM3) for emission scenarios A1B, A2, B1 and COMMIT for the period 1978-2100. The predictor variables are classified into three groups, namely A, B and C. Large-scale atmospheric variables such as air temperature, and zonal and meridional wind velocities at 925 mb, which are often used for downscaling temperature, are considered as predictors in Group A. Surface flux variables such as latent heat (LH), sensible heat, shortwave radiation and longwave radiation fluxes, which control the temperature of the Earth's surface, are tried as plausible predictors in Group B. Group C comprises all the predictor variables in both Groups A and B. Scatter plots and cross-correlations are used for verifying the reliability of the simulation of the predictor variables by the CGCM3 and to study the predictor-predictand relationships. The impact of trends in predictor variables on downscaled temperature was studied. The predictor air temperature at 925 mb showed an increasing trend, while the rest of the predictors showed no trend. The performance of the SVM models that were developed, one for each combination of predictor group, predictand, calibration period and location-based stratification (land; land and ocean) of climate variables, was evaluated. In general, the models which use predictor variables pertaining to the land surface improved the performance of SVM models for downscaling T-max and T-min.
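A minimal sketch of the core downscaling step, under stated assumptions: support vector regression maps large-scale predictors (stand-ins for NCEP/CGCM3 fields such as 925 mb air temperature and wind components) to a station-scale predictand; the paper's kernel choice, predictor screening and calibration design are not specified in the abstract, and the data here are synthetic:

```python
# SVR downscaling sketch: fit on a calibration period, score on validation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 3))        # 240 months x 3 predictor variables
y = 25 + 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, size=240)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:180], y[:180])          # calibration period
print(f"validation R^2 = {model.score(X[180:], y[180:]):.2f}")
```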

Relevance:

20.00%

Publisher:

Abstract:

While frame-invariant solutions for arbitrarily large rotational deformations have been reported through the orthogonal matrix parametrization, derivation of such solutions purely through a rotation vector parametrization, which uses only three parameters and provides a parsimonious storage of rotations, is novel and constitutes the subject of this paper. In particular, we employ interpolations of relative rotations and a new rotation vector update for a strain-objective finite element formulation in the material framework. We show that the update provides either the desired rotation vector or its complement. This rules out an additive interpolation of total rotation vectors at the nodes. Hence, interpolations of relative rotation vectors are used. Through numerical examples, we show that combining the proposed update with interpolations of relative rotations yields frame-invariant and path-independent numerical solutions. Advantages of the present approach vis-a-vis the updated Lagrangian formulation are also analyzed.
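A numerical check of the complement property behind the update discussed above: a rotation vector r and its complement r(1 - 2π/‖r‖) encode the same rotation, which is why a naive additive interpolation of total rotation vectors at the nodes is ill-defined. The paper's specific update formula is not reproduced here; this only verifies the standard identity:

```python
# Verify that a rotation vector and its complement give the same rotation.
import numpy as np
from scipy.spatial.transform import Rotation

r = np.array([0.4, -0.3, 2.9])            # rotation vector; angle = ||r||
angle = np.linalg.norm(r)
r_complement = (1.0 - 2.0 * np.pi / angle) * r

R1 = Rotation.from_rotvec(r).as_matrix()
R2 = Rotation.from_rotvec(r_complement).as_matrix()
print(np.allclose(R1, R2))                # True: same rotation, two vectors
```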

Relevance:

20.00%

Publisher:

Abstract:

As conventional MOSFET scaling approaches the limit imposed by short channel effects, Double Gate (DG) MOS transistors are emerging as the most feasible candidate in terms of technology for sub-45nm technology nodes. As the short channel effect in a DG transistor is controlled by the device geometry, an undoped or lightly doped body is used to sustain the channel. There exists a disparity in the threshold voltage calculation criteria of undoped-body symmetric double gate transistors, which use two definitions: one potential-based and the other charge-based. In this paper, a novel concept of the "crossover point" is introduced, which proves that the charge-based definition is more accurate than the potential-based definition. The change in threshold voltage with body thickness variation for a fixed channel length is anomalous as predicted by the potential-based definition, while it is monotonous for the charge-based definition. The threshold voltage is then extracted from drain current versus gate voltage characteristics using linear extrapolation and the "Third Derivative of Drain-Source Current" method, or simply the "TD" method. The trend of threshold voltage variation is found to be the same in both cases, which supports the charge-based definition.
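A minimal sketch of the two extraction methods named above, applied to an Id-Vg sweep: linear extrapolation takes the tangent at maximum transconductance gm and extrapolates it to Id = 0, while the "TD" method takes Vg at the peak of the third derivative of Id. The synthetic curve below (knee at 0.3 V) is only a placeholder for simulated or measured device data:

```python
# Threshold voltage extraction from an Id-Vg characteristic.
import numpy as np

def vth_linear_extrapolation(vg: np.ndarray, id_: np.ndarray) -> float:
    gm = np.gradient(id_, vg)            # transconductance dId/dVg
    k = int(np.argmax(gm))               # bias point of maximum gm
    return vg[k] - id_[k] / gm[k]        # tangent intercept at Id = 0

def vth_third_derivative(vg: np.ndarray, id_: np.ndarray) -> float:
    d3 = np.gradient(np.gradient(np.gradient(id_, vg), vg), vg)
    return vg[int(np.argmax(d3))]        # Vg at the third-derivative peak

vg = np.linspace(0.0, 1.0, 201)
id_ = np.where(vg > 0.3, 1e-4 * (vg - 0.3), 0.0)   # idealized linear law
print(vth_linear_extrapolation(vg, id_))   # ~0.30 V
print(vth_third_derivative(vg, id_))       # ~0.30 V
```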


Relevance:

20.00%

Publisher:

Abstract:

This study investigates the potential of a Relevance Vector Machine (RVM)-based approach to predict the ultimate capacity of laterally loaded piles in clay. RVM is a sparse approximate Bayesian kernel method. It can be seen as a probabilistic version of the support vector machine. It provides much sparser regressors without compromising performance, and kernel bases give a small but worthwhile improvement in performance. The RVM model outperforms the two other models based on root-mean-square error (RMSE) and mean absolute error (MAE) performance criteria. It also estimates the prediction variance. The results presented in this paper clearly highlight that the RVM is a robust tool for the prediction of the ultimate capacity of laterally loaded piles in clay.
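A minimal sketch of RVM-style regression under stated assumptions: sparse Bayesian learning with automatic relevance determination (ARD) over kernel basis functions centred at the training points, which is the construction an RVM uses. scikit-learn's ARDRegression stands in for a dedicated RVM implementation, and the inputs (pile and soil properties) and targets (ultimate lateral capacity) are hypothetical placeholders, not the paper's data:

```python
# RVM-style sparse Bayesian regression over an RBF kernel design matrix.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(1)
X = rng.uniform(size=(60, 3))              # e.g. diameter, embedment, su
y = 50 * X[:, 0] + 20 * X[:, 1] ** 2 + rng.normal(0, 1.0, size=60)

Phi = rbf_kernel(X, X, gamma=2.0)          # kernel design matrix
model = ARDRegression().fit(Phi, y)

relevant = np.flatnonzero(np.abs(model.coef_) > 1e-3)
print(f"{relevant.size} relevance vectors out of {len(X)} training points")

mean, std = model.predict(Phi[:5], return_std=True)  # prediction variance too
print(np.round(std, 3))
```

The pruning of most kernel weights to (near) zero is what gives the "much sparser regressors" noted in the abstract, and `return_std` mirrors the RVM's ability to report prediction variance.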