960 results for Initial value problems
Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
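The reliability coefficient reported above, Cronbach's alpha, follows a standard formula: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). As a rough sketch, it can be computed from raw questionnaire responses as below; the response data here are invented for illustration and have nothing to do with the ECNQ-CCV sample.

```python
# Cronbach's alpha from raw item responses (toy data, not the study's sample).

def cronbach_alpha(items):
    """items: list of per-item response lists, all of equal length."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# three items, four respondents (invented Likert-style data)
responses = [[2, 4, 3, 5],
             [3, 5, 4, 5],
             [2, 5, 3, 4]]
print(round(cronbach_alpha(responses), 3))
```

Values near 1 indicate that the items covary strongly, i.e. the scale is internally consistent.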
Abstract:
Forensic scientists face increasingly complex inference problems for evaluating likelihood ratios (LRs) for an appropriate pair of propositions. Up to now, scientists and statisticians have derived LR formulae using an algebraic approach. However, this approach reaches its limits when addressing cases with an increasing number of variables and dependence relationships between these variables. In this study, we suggest using a graphical approach, based on the construction of Bayesian networks (BNs). We first construct a BN that captures the problem, and then deduce the expression for calculating the LR from this model to compare it with existing LR formulae. We illustrate this idea by applying it to the evaluation of an activity level LR in the context of the two-trace transfer problem. Our approach allows us to relax assumptions made in previous LR developments, produce a new LR formula for the two-trace transfer problem and generalize this scenario to n traces.
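The LR deduction the authors describe can be miniaturized: below is a toy, single-trace sketch of computing a likelihood ratio from a transfer/background model, in the spirit of (but far simpler than) the paper's two-trace Bayesian network. The probabilities t and b are invented assumptions, not values from the study.

```python
# Toy LR sketch: a trace E is present if it was transferred by the alleged
# activity or is present from background sources.

def likelihood(trace_found, transfer_prob, background_prob):
    """P(E | H) under independence of transfer and background presence."""
    p_no_trace = (1 - transfer_prob) * (1 - background_prob)
    return 1 - p_no_trace if trace_found else p_no_trace

t, b = 0.6, 0.05                   # assumed transfer / background probabilities
p_e_hp = likelihood(True, t, b)    # Hp: the activity occurred, transfer possible
p_e_hd = likelihood(True, 0.0, b)  # Hd: no activity, background sources only
LR = p_e_hp / p_e_hd
print(round(LR, 2))
```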
Abstract:
In this thesis, I develop analytical models to price the value of supply chain investments under demand uncertainty. This thesis includes three self-contained papers. In the first paper, we investigate the value of lead-time reduction under the risk of sudden and abnormal changes in demand forecasts. We first consider the risk of a complete and permanent loss of demand. We then provide a more general jump-diffusion model, where we add a compound Poisson process to a constant-volatility demand process to explore the impact of sudden changes in demand forecasts on the value of lead-time reduction. We use an Edgeworth series expansion to divide the lead-time cost into that arising from constant instantaneous volatility, and that arising from the risk of jumps. We show that the value of lead-time reduction increases substantially in the intensity and/or the magnitude of jumps. In the second paper, we analyze the value of quantity flexibility in the presence of supply-chain disintermediation problems. We use the multiplicative martingale model and the "contracts as reference points" theory to capture both positive and negative effects of quantity flexibility for the downstream level in a supply chain. We show that lead-time reduction reduces both supply-chain disintermediation problems and supply-demand mismatches. We furthermore analyze the impact of the supplier's cost structure on the profitability of quantity-flexibility contracts. When the supplier's initial investment cost is relatively low, supply-chain disintermediation risk becomes less important, and hence the contract becomes more profitable for the retailer. We also find that supply-chain efficiency increases substantially with the supplier's ability to disintermediate the chain when the initial investment cost is relatively high. In the third paper, we investigate the value of dual sourcing for products with heavy-tailed demand distributions.
We apply extreme-value theory and analyze the effects of the tail heaviness of the demand distribution on the optimal dual-sourcing strategy. We find that the effects of tail heaviness depend on the characteristics of demand and profit parameters. When both the profit margin of the product and the cost differential between the suppliers are relatively high, it is optimal to buffer the mismatch risk by increasing both the inventory level and the responsive capacity as demand uncertainty increases. In that case, however, both the optimal inventory level and the optimal responsive capacity decrease as the tail of demand becomes heavier. When the profit margin of the product is relatively high, and the cost differential between the suppliers is relatively low, it is optimal to buffer the mismatch risk by increasing the responsive capacity and reducing the inventory level as demand uncertainty increases. In that case, however, it is optimal to buffer with more inventory and less capacity as the tail of demand becomes heavier. We also show that the optimal responsive capacity is higher for products with heavier tails when the fill rate is extremely high.
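The jump-diffusion demand model in the first paper combines a constant-volatility diffusion with a compound Poisson jump component; a minimal simulation sketch of such a process is shown below. All parameter values (mu, sigma, lam, jump size) are illustrative assumptions, not figures from the thesis.

```python
# Simulate one path of a demand forecast: geometric-Brownian-motion steps plus
# occasional multiplicative downward jumps arriving at Poisson rate lam.
import math
import random

def simulate_forecast(d0=100.0, mu=0.0, sigma=0.2, lam=0.5,
                      jump_mean=-0.3, n_steps=250, dt=1 / 250, seed=7):
    random.seed(seed)
    d = d0
    for _ in range(n_steps):
        z = random.gauss(0.0, 1.0)
        # diffusion part: constant-volatility (GBM) step
        d *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        # jump part: with probability lam*dt, apply a multiplicative jump
        if random.random() < lam * dt:
            d *= math.exp(jump_mean)
    return d

print(simulate_forecast() > 0)   # multiplicative dynamics keep demand positive
```

With the jump intensity or magnitude increased, simulated forecast paths swing more violently, which is the mechanism behind the paper's result that lead-time reduction gains value.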
Abstract:
Values and value processes are said to be needed in every organization nowadays, as the world is changing and companies have to have something to "keep it together". Organizational values, which are approved and used by the personnel, could be the key. Every organization has values. But what is the real value of values? The greatest and most crucial challenge is the feasibility of the value process. The main point in this thesis is to study how organizational members at different hierarchical levels perceive values and value processes in their organizations. This includes themes such as how values are disseminated, the targets of value processing, factors that affect the process, problems that occur during value implementation and improvements that could be made when organizational values are implemented. These subjects are studied from the perspective of organizational members (both managers and employees); individuals in the organizations. The aim is to get the insider perspective on value processing, from multiple hierarchical levels. In this research I study three different organizations (forest industry, bank and retail cooperative) and their value processes. The data were gathered by interviewing personnel in the head office and at the local level. The individuals are seen as members of organizations, and the cultural aspect is topical throughout the whole study. Values and cultures are seen as the 'actuality of reality' of organizations, interpreted by organizational members. The three case companies were chosen because they represented different lines of business and they all implemented value processing differently. Since the emphasis in this study is at the local level, the similar size of the local units was also an important factor. Values are in 'fashion', but what does this fashion tell us about real corporate practices? In annual reports companies emphasize the importance and power of official values.
But what is the real 'point' of values? Values are publicly respected and advertised, but it still seems that the words do not meet the deeds. There is a clear conflict between theoretical, official and substantive organizational values: in the value processing from words to real action. This contradiction in value processing is studied through individual perceptions in this study. I study the kinds of perceptions organizational members have when values are processed from the head office to the local level: the official value process is studied from the individual's perspective. Value management has been studied more since the 1990s. The emphasis has usually been on managers: how they consider the values in organizations and what effect this has on management. Recent literature has emphasized values as tools for improving company performance. Value implementation as a process has been studied through 'good' and 'bad' examples, as if one successful value process could be copied to all organizations. Each company is different, with different cultures and personnel, so no all-powerful way of processing values exists. In this study, the perceptions of organizational members at different hierarchical levels are emphasized. Still, managers are also interviewed; this is done because managerial roles in value dissemination are crucial. Organizational values cannot be well disseminated without management; this has been shown in several earlier studies (e.g. Kunda 1992, Martin 1992, Parker 2000). Recent literature has not sufficiently emphasized the individual's (organizational member's) role in value processing. Organizations consist of different individuals with personal values, at all hierarchical levels. The aim in this study is to let the individual take the floor. Very often the value process is described starting from the value definition and ending at dissemination, and the real results are left without attention. I wish to contribute to this area.
Values are published officially, in annual reports and elsewhere, as a 'goal' just like profits. Still, the results and implementation of value processing are rarely followed up, at least in official reports. This is a very interesting point: why do companies espouse values if there is no real control or feedback after the processing? In this study, the personnel in three different companies are asked to give an answer. The empirical findings include several results that bring new aspects to the research area of organizational values. The targets of value processing, factors affecting value processing, the management's roles and the problems in value implementation are presented through the individual's perspective. The individual's perceptions in value processing are a recurring theme throughout the whole study. A comparison between the three companies with diverse value processes makes the research complete.
Abstract:
Sudoku problems are among the best-known and most enjoyed pastimes, with a popularity that never diminishes, but over the last few years these problems have gone from entertainment to an interesting research area, and interesting in two ways. On the one hand, Sudoku problems, being a variant of Gerechte Designs and Latin Squares, are actively used for experimental design, as in [8, 44, 39, 9]. On the other hand, Sudoku problems, as simple as they seem, are really hard structured combinatorial search problems, and thanks to their characteristics and behavior they can be used as benchmark problems for refining and testing solving algorithms and approaches. Also, thanks to their highly structured nature, their study can contribute more than the study of random problems to our goal of solving real-world problems and applications and of understanding the problem characteristics that make them hard to solve. In this work we use two techniques for modeling and solving Sudoku problems, namely the Constraint Satisfaction Problem (CSP) and Satisfiability Problem (SAT) approaches. To this effect we define the Generalized Sudoku Problem (GSP), where regions can be of rectangular shape, problems can be of any order, and solution existence is not guaranteed. With respect to worst-case complexity, we prove that GSP with block regions of m rows and n columns with m = n is NP-complete. For studying the empirical hardness of GSP, we define a series of instance generators that differ in the level of balance they guarantee among the constraints of the problem, by finely controlling how the holes are distributed among the cells of the GSP. Experimentally, we show that the more balanced the constraints, the higher the complexity of solving the GSP instances, and that GSP is harder than the Quasigroup Completion Problem (QCP), a problem generalized by GSP.
Finally, we provide a study of the correlation between backbone variables (variables with the same value in all the solutions of an instance) and the hardness of GSP.
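The CSP view of GSP described above (rows, columns, and m-by-n rectangular regions must each hold distinct values) can be illustrated with a compact backtracking solver. This is a toy sketch of the constraint model, not the solving machinery used in the thesis.

```python
# Backtracking solver for a Generalized Sudoku of order m*n, where regions are
# rectangles of m rows and n columns; 0 marks a hole to be filled.

def solve_gsp(grid, m, n):
    """Solve an (m*n) x (m*n) generalized Sudoku in place; returns True if solved."""
    size = m * n

    def ok(r, c, v):
        if any(grid[r][j] == v for j in range(size)):   # row constraint
            return False
        if any(grid[i][c] == v for i in range(size)):   # column constraint
            return False
        r0, c0 = (r // m) * m, (c // n) * n             # top-left of the region
        return all(grid[i][j] != v
                   for i in range(r0, r0 + m) for j in range(c0, c0 + n))

    for r in range(size):
        for c in range(size):
            if grid[r][c] == 0:
                for v in range(1, size + 1):
                    if ok(r, c, v):
                        grid[r][c] = v
                        if solve_gsp(grid, m, n):
                            return True
                        grid[r][c] = 0
                return False            # no value fits this hole: backtrack
    return True                        # no holes left: solved

puzzle = [[0, 0, 0, 0],                # order-4 instance with 2x2 regions
          [0, 0, 2, 0],
          [3, 0, 0, 0],
          [0, 0, 0, 1]]
print(solve_gsp(puzzle, 2, 2))
```

How the holes (the 0 cells) are distributed across rows, columns and regions is exactly the balance knob the instance generators in the abstract control.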
Abstract:
This book explores the Russian synthesis that occurred in Russian economic thought between 1890 and 1920. This includes all the attempts at synthesis between classical political economy and marginalism; the labour theory of value and marginal utility; and value and prices. The various ways in which Russian economists have approached these issues have generally been addressed in a piecemeal fashion in the history of economic thought literature. This book returns to the primary sources in the Russian language, translating many into English for the first time, and offers the first comprehensive history of the Russian synthesis. The book first examines the origins of the Russian synthesis by determining the conditions of reception in Russia of the various theories of value involved: the classical theories of value of Ricardo and Marx on one side; the marginalist theories of prices of Menger, Walras and Jevons on the other. It then reconstructs the three generations of the Russian synthesis: the first (Tugan-Baranovsky), the second, the mathematicians (Dmitriev, Bortkiewicz, Shaposhnikov, Slutsky, etc.) and the last (Yurovsky), with an emphasis on Tugan-Baranovsky's initial impetus. This volume is suitable for those studying economic theory and philosophy as well as those interested in the history of economic thought.
Abstract:
We propose a new family of risk measures, called GlueVaR, within the class of distortion risk measures. Analytical closed-form expressions are shown for the most frequently used distribution functions in financial and insurance applications. The relationship between GlueVaR, Value-at-Risk (VaR) and Tail Value-at-Risk (TVaR) is explained. Tail-subadditivity is investigated and it is shown that some GlueVaR risk measures satisfy this property. An interpretation in terms of risk attitudes is provided and a discussion is given on the applicability in non-financial problems such as health, safety, environmental or catastrophic risk management.
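The two building blocks GlueVaR is related to, VaR and TVaR, can be sketched empirically from a loss sample. The GlueVaR distortion functions themselves are defined in the paper and are not reproduced here; the simple non-interpolating quantile convention below is an illustrative assumption.

```python
# Empirical Value-at-Risk and Tail Value-at-Risk from a loss sample.

def var(losses, alpha):
    """Empirical VaR at level alpha: the alpha-quantile of the loss sample."""
    s = sorted(losses)
    idx = int(alpha * len(s))           # simple (non-interpolating) quantile
    return s[min(idx, len(s) - 1)]

def tvar(losses, alpha):
    """Empirical TVaR: the average of losses at or beyond VaR_alpha."""
    q = var(losses, alpha)
    tail = [x for x in losses if x >= q]
    return sum(tail) / len(tail)

losses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(var(losses, 0.8), tvar(losses, 0.8))
```

TVaR averages over the tail beyond the VaR cutoff, so it is always at least as large as VaR at the same level; GlueVaR interpolates between such quantile-based measures via distortion functions.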
Abstract:
The objective of this thesis is to provide a business model framework that connects customer value to firm resources and explains the change logic of the business model. With strategic supply management, and especially dynamic value network management, as its scope, the dissertation is based on basic economic theories, transaction cost economics and the resource-based view. The main research question is how changing customer values should be taken into account when planning business in a networked environment. The main question is divided into questions that form the basic research problems for the separate case studies presented in the five Publications. This research adopts the case study strategy, and the constructive research approach within it. The material consists of data from several Delphi panels and expert workshops, software pilot documents, company financial statements and information on investor relations on the companies' web sites. The cases used in this study are a mobile multi-player game value network, smart phone and "Skype mobile" services, the business models of AOL, eBay, Google, Amazon and a telecom operator, a virtual city portal business system and a multi-play offering. The main contribution of this dissertation is bridging the gap between firm resources and customer value. This has been done by theorizing the business model concept and connecting it to both the resource-based view and customer value. This thesis contributes to the resource-based view, which deals with customer value and the firm resources needed to deliver that value but leaves a gap in explaining how changes in customer value should be connected to changes in key resources. This dissertation also provides tools and processes for analyzing the customer value preferences of ICT services, constructing and analyzing business models, business concept innovation and conducting resource analysis.
Abstract:
Lung cancer is one of the most common cancers. Often diagnosed at a late stage, it carries a poor prognosis and heavy repercussions for the patient's health. The initial treatment phase is a critical period during which the patient's relatives become informal caregivers and take on new responsibilities. This situation generates stress for the informal caregiver. The aim of this study was to describe the magnitude of stress in informal caregivers of patients undergoing initial treatment for lung cancer. The methodological approach was a descriptive cross-sectional design. The convenience sample consisted of 28 informal caregivers and 26 patients in initial treatment for lung cancer, followed at a university outpatient oncology centre in Switzerland. Caregiver stress was assessed using the Caregiver Reaction Assessment (CRA) instrument of Given et al. (1992), supplemented by a question measuring the perceived lack of information. The CRA measures the caregiver's experience of the negative and positive dimensions of stress on a five-point Likert scale. Neuman's systems model (2002) served as the study's theoretical framework. Caregivers gave greater weight (mean score 4.15) to Self-esteem, the positive dimension of stress, than to the negative dimensions. Among the latter, Disruption of activities was the dimension that most affected caregivers' daily lives (mean score 2.96). The caregiver's sociodemographic characteristics and the patient's medical data seem to influence the perceived level of stress, but other objective markers must be identified to refine the interpretation of these relationships.
A large majority (78%) of caregivers indicated that they had enough information to provide care, showing that information is an important issue in their experience and deserves further evaluation as a dimension of the stress associated with the informal caregiver role. Through their actions, nurses should aim to preserve the integrity and stability of the informal caregiver's health, as they do for patients. Systematic assessment of stress is a priority intervention for informal caregivers in the oncology setting. Personalized interventions can then be developed to support informal caregivers in helping their relative while preserving their own health.
Abstract:
AIMS: Proprotein convertase subtilisin kexin 9 (PCSK9) is an emerging target for the treatment of hypercholesterolaemia, but the clinical utility of PCSK9 levels to guide treatment is unknown. We aimed to prospectively assess the prognostic value of plasma PCSK9 levels in patients with acute coronary syndromes (ACS). METHODS AND RESULTS: Plasma PCSK9 levels were measured in 2030 ACS patients undergoing coronary angiography in a Swiss prospective cohort. At 1 year, the association between PCSK9 tertiles and all-cause death was assessed adjusting for the Global Registry of Acute Coronary Events (GRACE) variables, as well as the achievement of LDL cholesterol targets of <1.8 mmol/L. Patients with higher PCSK9 levels at angiography were more likely to have clinical familial hypercholesterolaemia (rate ratio, RR 1.21, 95% confidence interval, CI 1.09-1.53), be treated with lipid-lowering therapy (RR 1.46, 95% CI 1.30-1.63), present with a longer duration of chest pain (RR 1.29, 95% CI 1.09-1.53) and have higher C-reactive protein levels (RR 1.22, 95% CI 1.16-1.30). PCSK9 increased 12-24 h after ACS (374 ± 149 vs. 323 ± 134 ng/mL, P < 0.001). At 1-year follow-up, HRs for upper vs. lower PCSK9-level tertiles were 1.13 (95% CI 0.69-1.85) for all-cause death and remained similar after adjustment for the GRACE score. Patients with higher PCSK9 levels were less likely to reach the recommended LDL cholesterol targets (RR 0.81, 95% CI 0.66-0.99). CONCLUSION: In ACS patients, high initial PCSK9 plasma levels were associated with inflammation in the acute phase and hypercholesterolaemia, but did not predict mortality at 1 year.
Abstract:
BACKGROUND: While reduction of DUP (Duration of Untreated Psychosis) is a key goal in early intervention strategies, the predictive value of DUP on outcome has been questioned. We planned this study in order to explore the impact of three different definitions of "treatment initiation" on the predictive value of DUP on outcome in an early psychosis sample. METHODS: 221 early psychosis patients aged 18-35 were followed up prospectively over 36 months. DUP was measured using three definitions of treatment onset: initiation of antipsychotic medication (DUP1); engagement in a specialized programme (DUP2); and the combination of engagement in a specialized programme and adherence to medication (DUP3). RESULTS: 10% of patients never met the criteria for DUP3 and therefore were never adequately treated over the 36-month period of care. While DUP1 and DUP2 had a limited predictive value on outcome, DUP3, based on a more restrictive definition of treatment onset, was a better predictor of positive and negative symptoms, as well as of functional outcome at 12, 24 and 36 months. Globally, DUP3 explained 2 to 5 times more of the variance than DUP1 and DUP2, with effect sizes falling in the medium range according to Cohen. CONCLUSIONS: The limited predictive value of DUP on outcome in previous studies may be linked to problems of definition that do not take adherence to treatment into account. While these results need replication, they suggest that efforts to reduce DUP should continue, aiming both at early detection and at the development of engagement strategies.
Abstract:
This dissertation analyses the growing pool of copyrighted works which are offered to the public under Creative Commons licensing. The study consists of an analysis of the novel licensing system, the licensors, and the changes to the "all rights reserved" paradigm of copyright law. Copyright law reserves all rights to the creator until seventy years have passed since her demise. Many claim that this endangers communal interests. Quite often the creators are willing to release some rights. This, however, is very difficult to do and requires the help of specialized lawyers. The study finds that the innovative Creative Commons licensing scheme is well suited to low-value, high-volume licensing. It helps to reduce transaction costs on several levels. However, CC licensing is not a "silver bullet": privacy, moral rights, the problems of license interpretation, and license compatibility with other open licenses and with collecting societies remain unsolved. The study consists of seven chapters. The first chapter introduces the research topic and research questions. The second and third chapters inspect the technical, economic and legal aspects of the Creative Commons licensing scheme. The fourth and fifth chapters examine the incentives of licensors who use open licenses and describe certain open business models. The sixth chapter studies the role of collecting societies and whether the two institutions, Creative Commons and collecting societies, can coexist. The final chapter summarizes the findings. The dissertation contributes to the existing literature in several ways. There is a wide range of prior research on open source licensing; however, there is an urgent need for an extensive study of Creative Commons licensing and its actual and potential impact on the creative ecosystem.
Abstract:
The article describes some concrete problems that were encountered when writing a two-level model of Mari morphology. Mari is an agglutinative Finno-Ugric language spoken in Russia by about 600 000 people. The work was begun in the 1980s on the basis of K. Koskenniemi's Two-Level Morphology (1983), but in the latest stage K. Beesley's and L. Karttunen's Finite State Morphology (2003) was used. Many of the problems described in the article concern the inexplicitness of the rules in Mari grammars and the lack of information about the exact distribution of some suffixes, e.g. enclitics. The Mari grammars usually give complete paradigms for a few unproblematic verb stems, whereas the difficult or unclear forms of certain verbs are only superficially discussed. Another example of phenomena that are poorly described in grammars is the way suffixes with an initial sibilant combine with stems ending in a sibilant. The help of informants and searches in electronic corpora were used to overcome such difficulties in the development of the two-level model of Mari. The variation in the order of plural markers, case suffixes and possessive suffixes is a typical feature of Mari. The morphotactic rules constructed for Mari declensional forms tend to be recursive, and their productivity must be limited by some technical device, such as filters. In the present model, certain plural markers were treated like nouns. The positional and functional versatility of the possessive suffixes can be regarded as the most challenging phenomenon in attempts to formalize Mari morphology. Cyrillic orthography, which was used in the model, also caused problems. For instance, a Cyrillic letter may represent a sequence of two sounds, the first being part of the word stem while the second belongs to a suffix. In some cases, letters for voiced consonants are also generalized to represent voiceless consonants.
Such orthographical conventions distance a morphological model based on orthography from the actual (morpho)phonological processes in the language.
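The sibilant-contact problem mentioned above (suffixes with an initial sibilant attaching to sibilant-final stems) can be caricatured with a single toy surface rule in the spirit of two-level morphology. The stems, suffix, epenthesis rule and sibilant inventory below are hypothetical placeholders for illustration, not actual Mari morphology.

```python
# Toy morphotactic concatenation with one surface rule: insert an epenthetic
# vowel when a sibilant-final stem meets a sibilant-initial suffix.
# (Invented forms and rule; not real Mari data.)

SIBILANTS = set("szš")

def attach(stem, suffix, epenthetic="e"):
    """Concatenate stem + suffix, applying the toy sibilant-contact rule."""
    if stem and suffix and stem[-1] in SIBILANTS and suffix[0] in SIBILANTS:
        return stem + epenthetic + suffix
    return stem + suffix

print(attach("kol", "za"))   # no sibilant contact: plain concatenation
print(attach("muš", "za"))   # sibilant contact: epenthetic vowel inserted
```

A full two-level model states such alternations as parallel constraints between lexical and surface strings rather than as procedural string edits; this sketch only shows the kind of boundary phenomenon those rules must capture.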