971 results for Generalized extreme value distribution
Abstract:
INTRODUCTION: Hyperglycemia is a metabolic alteration in major burn patients associated with complications. The study aimed at evaluating the safety of general ICU glucose control protocols applied to major burns receiving prolonged ICU treatment. METHODS: 15-year retrospective analysis of consecutive, adult burn patients admitted to a single specialized centre. EXCLUSION CRITERIA: death or length of stay <10 days, age <16 years. VARIABLES: demographic variables, burned surface (TBSA), severity scores, infections, ICU stay, outcome. Metabolic variables: total energy, carbohydrate and insulin delivery/24 h, arterial blood glucose and CRP values. Analysis of 4 periods: 1, before protocol; 2, tight, doctor-driven; 3, tight, nurse-driven; 4, moderate, nurse-driven. RESULTS: 229 patients, aged 45±20 years (mean±SD), burned 32±20% TBSA, were analyzed. SAPS II was 35±13. TBSA, Ryan and ABSI scores remained stable. Inhalation injury increased. A total of 28,690 blood glucose samples were analyzed: the median value remained unchanged, with a narrower distribution over time. After protocol initiation, the proportion of normoglycemic values increased from 34.7% to 65.9%, with a reduction of hypoglycaemic events (no extreme hypoglycemia in period 4). Severe hyperglycemia persisted throughout, with a decrease in period 4 (9.25%). Energy and glucose deliveries decreased in periods 3 and 4 (p<0.0001). Infectious complications increased during the last 2 periods (p=0.01). CONCLUSION: A standardized ICU glucose control protocol improved glycemic control in adult burn patients, reducing glucose variability. Moderate glycemic control in burns was safe, particularly with regard to hypoglycemia, reducing the incidence of hypoglycaemic events compared with the period before the protocol. Hyperglycemia persisted, but at a lower level.
Abstract:
Georgia is known for its extraordinarily rich plant biodiversity, which may now be threatened by the spread of invasive alien plants (IAP). We aimed to identify (i) the most prominent of 9 selected potentially invasive and harmful IAP, by predicting their distribution under current and future climate conditions in Georgia as well as in its 43 Protected Areas, used as a proxy for areas of high conservation value, and (ii) the Protected Areas most at risk from these IAP. We used species distribution models based on 6 climate variables and then filtered the obtained distributions using maps of soil and vegetation types and recorded occurrences, resulting in the predicted ecological distribution of the 9 IAP at a resolution of 1 km2. Our habitat suitability analysis showed that Ambrosia artemisiifolia (24% and 40%), Robinia pseudoacacia (14% and 19%) and Ailanthus altissima (9% and 11%) have the largest potential distributions (predicted % of area covered), with Ailanthus altissima showing the largest potential increase over the next fifty years (from 9% to 13% and from 11% to 25%), for Georgia and the Protected Areas, respectively. Furthermore, our results indicate two areas in Georgia that are under especially high threat, i.e. the area around Tbilisi and an area in the western part of Georgia (Adjara), both at lower altitudes. Our procedure to identify areas of high conservation value most at risk from IAP has been applied here for the first time. It will help national authorities prioritize their measures to protect Georgia's outstanding biodiversity from the negative impact of IAP.
Abstract:
Wastewater-based epidemiology consists of acquiring relevant information about the lifestyle and health status of a population through the analysis of wastewater samples collected at the influent of a wastewater treatment plant. Although a very young discipline, it has developed remarkably since its first application in 2005. The possibility of gathering community-wide information about drug use has been among its major fields of application. The wide resonance of the first results sparked the interest of scientists from various disciplines, and research has since broadened in innumerable directions. Although praised as a revolutionary approach, its added value needed to be critically assessed against the existing indicators used to monitor illicit drug use. The main, and explicit, objective of this research was to evaluate the added value of wastewater-based epidemiology with regard to two particular, although interconnected, dimensions of illicit drug use. The first is the epidemiological, or societal, perspective: to evaluate whether and how the discipline completes our current vision of the extent of illicit drug use at the population level, and whether it can guide the planning of future prevention measures and drug policies. The second dimension is the criminal one, with a particular focus on the networks that develop around the large demand for illicit drugs. The goal here was to assess whether wastewater-based epidemiology, combined with indicators stemming from the epidemiological dimension, could provide additional clues about the structure of drug distribution networks and the size of their market. This research also had an implicit objective: initiating wastewater-based epidemiology at the Ecole des Sciences Criminelles of the University of Lausanne. This consisted of gathering the necessary knowledge about the collection, preparation, and analysis of wastewater samples and, most importantly, understanding how to interpret the acquired data and produce useful information. In the first phase of this research, it was possible to determine that ammonium loads, measured directly in the wastewater stream, could be used to monitor the dynamics of the population served by the wastewater treatment plant. Furthermore, it was shown that, in the long term, population dynamics did not have a substantial impact on consumption patterns measured through wastewater analysis. Focusing on methadone, for which precise prescription data were available, it was possible to show that reliable consumption estimates could be obtained via wastewater analysis. This validated the selected sampling strategy, which was then used to monitor the consumption of heroin through the measurement of morphine. The latter, in combination with prescription and sales data, provided estimates of heroin consumption in line with other indicators. These results, combined with epidemiological data, highlighted the good correspondence between measurements and expectations and, furthermore, suggested that the dark figure of heroin users evading harm-reduction programs, who would thus not be measured by conventional indicators, is likely limited. In the third part, a collaborative study investigating geographical differences in drug use, wastewater analysis was shown to be a useful complement to existing indicators.
In particular for stigmatised drugs, such as cocaine and heroin, it helped decipher the complex picture derived from surveys and crime statistics. Globally, it provided relevant information for better understanding the drug market, from both an epidemiological and a repressive perspective. The fourth part focused on cannabis and on the potential of combining wastewater and survey data to overcome some of their respective limitations. Using a hierarchical inference model, it was possible to refine current estimates of cannabis prevalence in the metropolitan area of Lausanne. Wastewater results suggested that the actual prevalence is substantially higher than existing figures, supporting the common belief that surveys tend to underestimate cannabis use. Although affected by several biases, the information collected through surveys helped overcome some of the limitations linked to the analysis of cannabis markers in wastewater (i.e., stability and limited excretion data). These findings highlighted the importance and utility of combining wastewater-based epidemiology with existing indicators of drug use. Similarly, the fifth part of the research was centred on assessing the potential uses of wastewater-based epidemiology from a law enforcement perspective. Through three concrete examples, it was shown that results from wastewater analysis can be used to produce highly relevant intelligence, allowing drug enforcement to assess the structure and operations of drug distribution networks and, ultimately, to guide decisions at the tactical and/or operational level. Finally, the potential of wastewater-based epidemiology to monitor the use of harmful, prohibited and counterfeit pharmaceuticals was illustrated through the analysis of sibutramine, and its urinary metabolite, in wastewater samples. The results of this research have highlighted that wastewater-based epidemiology is a useful and powerful approach with numerous applications. Faced with the complexity of measuring a hidden phenomenon like illicit drug use, it is a major addition to the panoply of existing indicators.
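For context, the kind of back-calculation that wastewater-based epidemiology relies on converts a measured metabolite load at the treatment plant influent into an estimate of parent-drug consumption. The sketch below is a standard formulation in the field, not code or figures from this thesis, and every number in the example is purely hypothetical:

```python
def consumption_mg_per_day_per_1000(conc_ng_per_l, flow_m3_per_day,
                                    excretion_fraction, mw_parent, mw_metabolite,
                                    population):
    """Back-calculate parent-drug use from a metabolite concentration in wastewater."""
    # Daily metabolite load at the influent: ng/L * m3/day -> mg/day.
    load_mg_per_day = conc_ng_per_l * flow_m3_per_day * 1e3 * 1e-6
    # Correct for the excreted fraction and the parent/metabolite molar-mass ratio.
    parent_mg_per_day = load_mg_per_day * (mw_parent / mw_metabolite) / excretion_fraction
    # Normalize per 1000 inhabitants served by the plant.
    return parent_mg_per_day / (population / 1000.0)


# Hypothetical example with morphine as the marker (heroin estimates additionally
# require subtracting morphine coming from prescriptions and other licit sources).
print(consumption_mg_per_day_per_1000(conc_ng_per_l=400.0, flow_m3_per_day=80_000.0,
                                      excretion_fraction=0.75, mw_parent=369.4,
                                      mw_metabolite=285.3, population=230_000))
```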
Abstract:
The most suitable method for estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables, such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, which assume that data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated by some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives accurate estimates of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained whether original or log-transformed sizes are used, and size measurements with different dimensionality (lengths, areas, volumes or biomasses) may be directly compared by simply adding ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after standardization by the sample geometric mean, emerges as the most reliable and generalizable method of size diversity evaluation.
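The estimator described above, standardize sizes by the sample geometric mean, fit a kernel density, then evaluate the Shannon integral, can be sketched in a few lines. This is a minimal illustration under my own assumptions (Python with NumPy/SciPy, a Gaussian kernel with default bandwidth, and a simple trapezoidal integration grid), not the authors' code:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gaussian_kde


def size_diversity(sizes, grid_points=2048):
    """Shannon size diversity, mu = -integral p(x) ln p(x) dx, of a size sample."""
    sizes = np.asarray(sizes, dtype=float)
    geo_mean = np.exp(np.mean(np.log(sizes)))      # sample geometric mean
    x = sizes / geo_mean                           # standardization proposed above

    kde = gaussian_kde(x)                          # nonparametric (kernel) pdf estimate
    grid = np.linspace(0.0, x.max() * 1.5, grid_points)
    p = np.clip(kde(grid), 1e-300, None)           # avoid log(0)
    return -trapezoid(p * np.log(p), grid)         # -integral p ln p dx


# Example: a log-normally distributed sample of individual sizes (arbitrary units).
rng = np.random.default_rng(0)
print(size_diversity(rng.lognormal(mean=0.0, sigma=0.8, size=500)))
```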
Abstract:
A class of three-sided markets (and games) is considered, where value is generated by pairs or triplets of agents belonging to different sectors, as well as by individuals. For these markets we analyze the situation that arises when some agents leave the market with some payoff. To this end, we introduce the derived market (and game) and relate it to the Davis and Maschler (1965) reduced game. Consistency with respect to the derived market, together with singleness best and individual anti-monotonicity, axiomatically characterizes the core for these generalized three-sided assignment markets. These markets may have an empty core, but we define a balanced subclass in which the worth of each triplet is the sum of the worths of the pairs it contains. Keywords: Multi-sided assignment market, Consistency, Core, Nucleolus. JEL Classification: C71, C78
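The balanced subclass mentioned above admits a simple worked form. The notation here is mine, not necessarily the paper's: writing a_{ij}, b_{jk} and c_{ik} for the worths of the mixed pairs drawn from the three sectors M1, M2 and M3, the worth of a triplet is defined additively from its pairs:

```latex
% Additive (balanced) subclass: a triplet's worth is the sum of its pairs' worths.
\[
  w(\{i, j, k\}) \;=\; a_{ij} + b_{jk} + c_{ik},
  \qquad i \in M_1,\ j \in M_2,\ k \in M_3 .
\]
```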
Abstract:
As a result of recent regulatory amendments and other development trends in the electricity distribution business, the sector is currently witnessing radical restructuring that will eventually impact its business logics. This report presents upcoming changes in the electricity distribution industry and concentrates on the factors that are expected to be the most fundamental ones. Electricity network companies nowadays struggle with legislative and regulatory requirements that address both the operational efficiency and the reliability of electricity distribution networks. The forces affecting distribution network companies can be put into three main categories that define the transformation at a general level: (1) a requirement for a more functional marketplace for energy, (2) environmental aspects (combating climate change etc.), and (3) a strongly emphasized requirement for the security of energy supply. The first point arises from legislators’ attempts to increase competition in electricity retail markets, the second concerns both environmental protection and human safety issues, and the third reflects societies’ reduced willingness to accept interruptions in electricity supply. In the future, regulation of the electricity distribution business may lower the threshold for building more weather-resistant networks, which in turn means increased underground cabling. This development pattern is reinforced by tightening safety and environmental regulations that ultimately make overhead lines expensive to build and maintain. The changes will require new approaches, particularly in network planning, construction, and maintenance. A dedicated concept for planning, constructing, and maintaining cable networks is necessary because the interdependencies between network operations are strong; in other words, each operation must be linked to the others.
Abstract:
Both the competitive environment and the internal structure of an industrial organization are typically included in descriptions of the firm’s strategic management processes, but less attention has been paid to the interdependence between these views. Therefore, this research focuses on explaining the particular conditions of an industry change that lead managers to realign the firm with respect to its environment in order to generate competitive advantage. The research question that directs the development of the theoretical framework is: Why do firms outsource some of their functions? The three general stages of the analysis are related to the following research topics: (i) understanding the forces that shape the industry, (ii) estimating the impacts of transforming customer preferences, rivalry, and changing capability bases on the relevance of existing assets and activities and on the emergence of new business models, and (iii) developing optional structures for future value chains and understanding general boundaries for market emergence. The defined research setting contributes to the managerial research questions “Why do firms reorganize their value chains?” and “Why and how are decisions made?” Combining Transaction Cost Economics (TCE) and the Resource-Based View (RBV) within an integrated framework makes it possible to evaluate two dimensions of a company’s resources, namely their strategic value and transferability. The final restructuring decision is made based on an analysis of the actual business potential of the outsourcing, in which benefits and risks are evaluated. The firm focuses on the risk of opportunism, hold-up problems, pricing, the possibility of reaching a complete contract, and finally the direct benefits and risks for financial performance. The supplier analyzes the business potential of the activity outside the specific customer, the amount of customer-specific investments, the service provider’s competitive position, the ability to gain revenue in generic segments, and the long-term dependence on the customer.
Abstract:
Some bilingual societies exhibit a distribution of language skills that cannot be explained by economic theories that portray languages as pure communication devices. Such distributions of skills are typically the result of public policies that promote bilingualism among members of both speech communities (reciprocal bilingualism). In this paper I argue that these policies are likely to increase social welfare by diminishing economic and social segmentation between the two communities. However, these gains tend to be unequally distributed over the two communities. As a result, in a large range of circumstances these policies might not draw sufficient support. The model is built upon the communicative value of languages, but also emphasizes the role of linguistic preferences in the behavior of bilingual individuals.
Abstract:
A prospective study of IgG and IgM isotypes of anticardiolipin antibodies (aCL) in a series of 100 patients with systemic lupus erythematosus was carried out. To determine the normal range of both isotype titres, a group of 100 normal control serum samples was studied, and a log-normal distribution of IgG and IgM isotypes was found. A serum sample was regarded as positive for IgG anticardiolipin antibody if its binding index was greater than 2.85 (SD 3.77), and a binding index greater than 4.07 (3.90) was defined as positive for IgM anticardiolipin antibody. Twenty-four patients were positive for IgG aCL, 20 for IgM aCL, and 36 for IgG or IgM aCL, or both. IgG aCL were found to have a significant association with thrombosis and thrombocytopenia, and IgM aCL with haemolytic anaemia and neutropenia. Specificity and predictive value for these clinical manifestations increased at moderate and high anticardiolipin antibody titres. In addition, a significant association was found between aCL and the presence of lupus anticoagulant. Identification of these differences in anticardiolipin antibody isotype associations may improve the clinical usefulness of these tests, and this study confirms the good specificity and predictive value of the anticardiolipin antibody titre for these clinical manifestations.
Abstract:
In recent years, the vulnerability of the network to natural hazards has received attention. Moreover, operating at the limits of the network transmission capabilities has resulted in major outages during the past decade. One of the reasons for operating at these limits is that the network has become outdated. Therefore, new technical solutions are studied that could provide more reliable and more energy-efficient power distribution and also better profitability for the network owner. It is the development and price of power electronics that have made DC distribution an attractive alternative again. In this doctoral thesis, one type of low-voltage DC distribution system is investigated. More specifically, it is studied which current technological solutions, used at the customer end, could provide better power quality for the customer compared with the current system. To study the effect of a DC network on the customer-end power quality, a bipolar DC network model is derived. The model can also be used to identify the supply parameters when the V/kW ratio is approximately known. Although the model describes the average behavior, it is shown that the instantaneous DC voltage ripple should be limited. Guidelines are given for choosing an appropriate capacitance value for the capacitor located at the input DC terminals of the customer end. The structure of the customer end is also considered. A comparison between the most common solutions is made based on their cost, energy efficiency, and reliability. In the comparison, special attention is paid to passive filtering solutions, since the filter is considered a crucial element when the lifetime expenses are determined. It is found that the filter topology most commonly used today, namely the LC filter, does not provide an economic advantage over the hybrid filter structure. Finally, some typical control system solutions are introduced and their shortcomings are presented. As a solution to the customer-end voltage regulation problem, an observer-based control scheme is proposed. It is shown how different control system structures affect the performance. Performance meeting the requirements is achieved by using only one output measurement when operating in a rigid network; similar performance can be achieved in a weak grid with a DC voltage measurement. An additional improvement is achieved when adaptive gain-scheduling-based control is introduced. In conclusion, the final power quality is determined by the sum of various factors, and the thesis provides guidelines for designing a system that improves the power quality experienced by the customer.
Abstract:
The objective of the dissertation is to examine organizational responses of public actors to customer requirements which drive the transformation of value networks and promote public-private partnership in the electricity distribution industry and the elderly care sector. The research bridges the concept of offering with value networks in which capabilities can be acquired for novel product concepts. The research contributes to recent literature by re-examining theories on the interactions of customer requirements and supply management. A critical realist case study approach is applied in this abductive research, which aims to describe causalities in the analyzed phenomena. The presented evidence is based on three sources: in-depth interviews, archival analysis and the Delphi method. Service provision requires awareness of the technology and functionalities of the offering. Moreover, service provision involves interactions of multiple partners, which suggests the importance of a co-operative orientation of actors. According to the findings, portfolio management has a key role when intelligent solutions are implemented in public service provision, because such concepts involve a variety of resources from multiple suppliers. However, emergent networks are not functional if they lack leaders who have access to the customer interface, the power to steer networks and the capability to build offerings. Public procurement policies were recognized to have a narrow scope in which price is a key factor in decisions. In the future, the public sector has to implement technology strategies and portfolio management, which means long-term platform development and commitment to partnerships. On the other hand, the service providers should also be more aware of the offerings into which their products will be integrated in the future. This requires bringing the customer’s voice into product development and co-operation in order to increase the interconnectivity of products.
Abstract:
In this report, we summarize the results of our part of the ÄLYKOP project on customer value creation at the intersection of the health care, ICT, forest and energy industries. The research aims to describe how industry transformation and convergence create new possibilities, business opportunities and even new industries. The report consists of findings previously presented in academic publications. It discusses customer value, service provision and the resource basis of the novel concepts through multiple theoretical frameworks. The report is divided into three main sections: theoretical background, a discussion of the health care industry, and evaluations of novel smart home concepts. Transaction cost economics and the Resource-Based View of the firm provide the theoretical basis for analyzing the described phenomena. The health care industry analysis describes the most important changes in the demand conditions of health care services and explores the features that are likely to open new business opportunities for a solution provider. The third part of the report, on the smart home business, illustrates a few potential concepts that could provide solutions to the economic problems arising from the aging of the population. The results provide several recommendations for smart home platform developers in the public and private sectors. According to the analysis, public organizations dominate service provision and private markets are currently in an emergent state. We argue that public-private partnerships are necessary for creating key suppliers. Indeed, paying attention to appropriate regulation, service specifications and technology standards would foster the diffusion of new services. The dynamics of the service provision networks are driven by the need for new capabilities, which are required for adapting business concepts to the new competitive situation. Finally, the smart home framework revealed links between conventionally distant business areas such as health care and energy distribution. The platform integrates functionalities for different purposes that nevertheless draw on the same resource basis.
Abstract:
The objective of this thesis work is to develop and study the Differential Evolution algorithm for multi-objective optimization with constraints. Differential Evolution is an evolutionary algorithm that has gained popularity because of its simplicity and good observed performance. Multi-objective evolutionary algorithms have become popular since they are able to produce a set of compromise solutions during the search process to approximate the Pareto-optimal front. The starting point for this thesis was an idea of how Differential Evolution, with simple changes, could be extended to optimization with multiple constraints and objectives. This approach is implemented, experimentally studied, and further developed in the work. Development and study concentrate on the multi-objective optimization aspect. The main outcomes of the work are versions of a method called Generalized Differential Evolution. The versions aim to improve the performance of the method in multi-objective optimization. A diversity preservation technique that is effective and efficient compared to previous diversity preservation techniques is developed. The thesis also studies the influence of the control parameters of Differential Evolution in multi-objective optimization, and proposals for initial control parameter value selection are given. Overall, the work contributes to the diversity preservation of solutions in multi-objective optimization.
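To make the idea of extending Differential Evolution to multiple objectives concrete, here is a minimal sketch rather than code from the thesis: it assumes Python with NumPy, uses the standard DE/rand/1/bin operators, and replaces the usual scalar selection with a Pareto-dominance test. The full Generalized Differential Evolution method also handles constraints, grows and then prunes the population, and preserves diversity; all of that is omitted, and every name and parameter value below is illustrative.

```python
import numpy as np


def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization)."""
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))


def gde_sketch(objective, lower, upper, pop_size=40, gens=200, F=0.5, CR=0.9, seed=0):
    """Differential Evolution with a simplified Pareto-dominance-based selection."""
    rng = np.random.default_rng(seed)
    dim = len(lower)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fit = np.array([objective(x) for x in pop])

    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation using three distinct individuals other than i.
            choices = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(choices, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lower, upper)
            # Binomial crossover with at least one mutated coordinate.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = objective(trial)
            # Selection: keep the trial only if it dominates the target vector.
            if dominates(f_trial, fit[i]):
                pop[i], fit[i] = trial, f_trial
    return pop, fit


# Toy bi-objective problem (Schaffer): minimize [x^2, (x - 2)^2] on [-5, 5].
pop, fit = gde_sketch(lambda x: np.array([x[0] ** 2, (x[0] - 2) ** 2]),
                      np.array([-5.0]), np.array([5.0]))
```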
Abstract:
The intent of this research was to develop a model that describes the extent to which customer behavioral intentions are influenced by service quality, customer satisfaction and customer perceived value in the business-to-business service context. Research on customer behavioral intentions is quite fragmented and no generalized model has been presented; thus, there was a need for empirical testing. This study builds on services marketing theory and assesses the relationships between the identified constructs. The data for the empirical analysis were collected via a quantitative online survey, and a total of 226 usable responses were obtained for further analysis. The model was tested in an employment agency service setting. The measures used in this survey were first assessed using confirmatory factor analysis (CFA), after which the hypothesized relationships were further verified using structural equation modeling (SEM) in LISREL 8.80. The analysis identified that customer satisfaction played a pivotal role in the model, as it was the only direct antecedent of customer behavioral intentions; however, customer perceived value showed a strong indirect impact on buying intentions via customer satisfaction. In contrast to what was hypothesized, service quality and customer perceived value did not have a direct positive effect on behavioral intentions. Also, in contrast with current literature, sacrifice was found to have a direct but positive impact on customer perceived value. Based on the findings of this study, managers should think carefully about the service strategies that lead to their customers’ favorable behavioral intentions.
Abstract:
Brazil is a country of continental dimensions with a population of different ethnic backgrounds. Thus, a wide variation in the frequencies of hepatitis C virus (HCV) genotypes is expected to occur. To address this point, 1,688 sequential samples from chronic HCV patients were analyzed. HCV-RNA was amplified by RT-PCR from blood samples collected from 1995 to 2000 at different laboratories located in different cities in all Brazilian states. Samples were collected in tubes containing a gel separator, centrifuged at the site of collection and sent by express mail in a refrigerated container to Laboratório Bioquímico Jardim Paulista, São Paulo, SP, Brazil. HCV-RNA was extracted from serum and submitted to RT and nested PCR using standard procedures. Nested PCR products were submitted to cycle sequencing reactions without prior purification. Sequences were analyzed for genotype determination and the following frequencies were found: 64.9% (1,095) for genotype 1, 4.6% (78) for genotype 2, 30.2% (510) for genotype 3, 0.2% (3) for genotype 4, and 0.1% (2) for genotype 5. The frequencies of HCV genotypes were statistically different among Brazilian regions (P = 0.00017). In all regions, genotype 1 was the most frequent (51.7 to 74.1%), reaching the highest value in the North; genotype 2 was more prevalent in the Center-West region (11.4%), especially in Mato Grosso State (25.8%), while genotype 3 was more common in the South (43.2%). Genotypes 4 and 5 were rarely found, and only in the Southeast, in São Paulo State. The present data indicate the need for careful epidemiological surveys throughout Brazil, since knowing the frequency and distribution of the genotypes provides key information for understanding the spread of HCV.