952 resultados para Nonhomogeneous initial-boundary-value problems


Relevância:

30.00%

Publicador:

Resumo:

Sudoku problems are among the best-known and most enjoyed pastimes, with a never-diminishing popularity, but over the last few years these problems have gone from an entertainment to an interesting research area, interesting in two respects. On the one hand, Sudoku problems, being a variant of Gerechte Designs and Latin Squares, are actively used for experimental design, as in [8, 44, 39, 9]. On the other hand, Sudoku problems, as simple as they seem, are really hard structured combinatorial search problems, and thanks to their characteristics and behavior they can be used as benchmark problems for refining and testing solving algorithms and approaches. Moreover, thanks to their rich inner structure, their study can contribute more than the study of random problems to our goal of solving real-world problems and applications and of understanding which problem characteristics make problems hard to solve. In this work we use two techniques for modeling and solving Sudoku problems, namely the Constraint Satisfaction Problem (CSP) and Satisfiability Problem (SAT) approaches. To this end we define the Generalized Sudoku Problem (GSP), where regions can be of rectangular shape, problems can be of any order, and solution existence is not guaranteed. With respect to worst-case complexity, we prove that GSP with block regions of m rows and n columns with m = n is NP-complete. To study the empirical hardness of GSP, we define a series of instance generators that differ in the level of balance they guarantee between the constraints of the problem, by finely controlling how the holes are distributed among the cells of the GSP. Experimentally, we show that the more balanced the constraints, the higher the complexity of solving the GSP instances, and that GSP is harder than the Quasigroup Completion Problem (QCP), a problem that GSP generalizes. Finally, we provide a study of the correlation between backbone variables (variables that take the same value in all solutions of an instance) and the hardness of GSP.
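To illustrate the constraint structure the abstract refers to, here is a minimal Python sketch (our own illustration, not code from the work itself) that checks the all-different constraints of a Generalized Sudoku grid with block regions of m rows and n columns; holes are marked with 0 and ignored:

```python
from itertools import product

def gsp_valid(grid, m, n):
    """Check the all-different constraints of a Generalized Sudoku grid.

    grid is an (m*n) x (m*n) list of lists; 0 marks a hole and is ignored.
    Blocks have m rows and n columns, so there are n block-rows and m
    block-columns in the grid.
    """
    size = m * n
    units = []
    units += [[(r, c) for c in range(size)] for r in range(size)]  # rows
    units += [[(r, c) for r in range(size)] for c in range(size)]  # columns
    for br, bc in product(range(n), range(m)):                     # blocks
        units.append([(br * m + r, bc * n + c)
                      for r in range(m) for c in range(n)])
    for unit in units:
        vals = [grid[r][c] for r, c in unit if grid[r][c] != 0]
        if len(vals) != len(set(vals)):  # duplicate value in a unit
            return False
    return True
```

A CSP or SAT encoding imposes exactly these all-different units as constraints or clauses; the checker above only verifies them on a candidate assignment.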


Resumo:

This book explores the synthesis that occurred in Russian economic thought between 1890 and 1920: all the attempts at synthesis between classical political economy and marginalism, between the labour theory of value and marginal utility, and between value and prices. The various ways in which Russian economists approached these issues have generally been addressed in a piecemeal fashion in the history of economic thought literature. This book returns to the primary sources in the Russian language, translating many into English for the first time, and offers the first comprehensive history of the Russian synthesis. The book first examines the origins of the Russian synthesis by determining the conditions of reception in Russia of the various theories of value involved: the classical theories of value of Ricardo and Marx on one side, and the marginalist theories of prices of Menger, Walras and Jevons on the other. It then reconstructs the three generations of the Russian synthesis: the first (Tugan-Baranovsky), the second, the mathematicians (Dmitriev, Bortkiewicz, Shaposhnikov, Slutsky, etc.), and the last (Yurovsky), with an emphasis on Tugan-Baranovsky's initial impetus. This volume is suitable for those studying economic theory and philosophy, as well as for those interested in the history of economic thought.


Resumo:

Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validation, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
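For reference, the Cronbach's alpha reliability statistic reported above can be computed from a respondents-by-items score matrix as k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch, not tied to the ECNQ-CCV data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return n_items / (n_items - 1) * (1.0 - item_vars.sum() / total_var)
```

Values approaching 1 (such as the 0.882 reported) indicate high internal consistency among the questionnaire items.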


Resumo:

We propose a new family of risk measures, called GlueVaR, within the class of distortion risk measures. Analytical closed-form expressions are given for the distribution functions most frequently used in financial and insurance applications. The relationship between GlueVaR, Value-at-Risk (VaR) and Tail Value-at-Risk (TVaR) is explained. Tail-subadditivity is investigated, and it is shown that some GlueVaR risk measures satisfy this property. An interpretation in terms of risk attitudes is provided, along with a discussion of the applicability to non-financial problems such as health, safety, environmental or catastrophic risk management.
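As a rough illustration of the relationship the abstract mentions, a GlueVaR measure can be written as a linear combination of TVaR at two confidence levels and VaR at one. The sketch below uses simple empirical estimators and generic weights w1, w2 (standing in for the weights derived from the distortion parameters in the original formulation); it is our own illustration, not the paper's code:

```python
import numpy as np

def var(losses, alpha):
    """Empirical Value-at-Risk: the alpha-quantile of the loss sample."""
    return np.quantile(losses, alpha)

def tvar(losses, alpha):
    """Empirical Tail Value-at-Risk: mean loss at or beyond VaR_alpha."""
    v = var(losses, alpha)
    return losses[losses >= v].mean()

def gluevar(losses, alpha, beta, w1, w2):
    """GlueVaR as a weighted combination of TVaR_beta, TVaR_alpha and
    VaR_alpha, with weights w1, w2 and 1 - w1 - w2 (illustrative)."""
    w3 = 1.0 - w1 - w2
    return (w1 * tvar(losses, beta)
            + w2 * tvar(losses, alpha)
            + w3 * var(losses, alpha))
```

With w1 = w2 = 0 the measure collapses to VaR_alpha, and with w2 = 1, w1 = 0 to TVaR_alpha, which is how VaR and TVaR arise as boundary cases of the family.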


Resumo:

The objective of this thesis is to provide a business model framework that connects customer value to firm resources and explains the change logic of the business model. With strategic supply management, and especially dynamic value network management, as its scope, the dissertation is based on basic economic theories, transaction cost economics and the resource-based view. The main research question is how changing customer values should be taken into account when planning business in a networked environment. The main question is divided into sub-questions that form the basic research problems of the separate case studies presented in the five Publications. This research adopts the case study strategy, and the constructive research approach within it. The material consists of data from several Delphi panels and expert workshops, software pilot documents, company financial statements and investor relations information on the companies' web sites. The cases used in this study are a mobile multi-player game value network, smart phone and "Skype mobile" services, the business models of AOL, eBay, Google, Amazon and a telecom operator, a virtual city portal business system and a multi-play offering. The main contribution of this dissertation is bridging the gap between firm resources and customer value. This is done by theorizing the business model concept and connecting it to both the resource-based view and customer value. The thesis thereby contributes to the resource-based view, which deals with customer value and the firm resources needed to deliver that value, but leaves a gap in explaining how changes in customer value should be connected to changes in key resources. The dissertation also provides tools and processes for analyzing the customer value preferences of ICT services, for constructing and analyzing business models and business concept innovation, and for conducting resource analysis.


Resumo:

Lung cancer is among the most frequent cancers. Often diagnosed at a late stage, it is characterized by a poor prognosis and a heavy impact on the patient's health. The initial treatment phase is a critical moment during which the patient's relatives become family caregivers and take on new responsibilities, a situation that generates stress for the caregiver. The aim of this study was to describe the extent of stress in family caregivers of patients undergoing initial treatment for lung cancer. A descriptive cross-sectional design was used. The convenience sample comprised 28 family caregivers and 26 patients undergoing initial treatment for lung cancer, followed at a university ambulatory oncology centre in Switzerland. Caregiver stress was assessed with the Caregiver Reaction Assessment (CRA) instrument of Given et al. (1992), supplemented by a question measuring the perceived lack of information. The CRA measures how the caregiver experiences the negative and positive dimensions of stress on a five-point Likert scale. Neuman's systems model (2002) served as the theoretical framework for the study. Caregivers rated Self-esteem, the positive dimension of stress, higher (mean score 4.15) than the negative dimensions. Among the latter, Disruption of schedule was the dimension that most affected caregivers' daily lives (mean score 2.96). The caregivers' sociodemographic characteristics and the patients' medical data appear to influence the level of perceived stress, but other objective markers must be identified to refine the interpretation of these relationships. A large majority (78%) of the caregivers indicated that they had enough information to provide care, showing that information is an important aspect of their experience and deserves further evaluation as a dimension of the stress associated with the family caregiver role. Through their interventions, nurses should aim to preserve the integrity and stability of the caregiver's health, just as they do for patients. Systematic assessment of stress is a priority intervention for family caregivers in the oncology setting. Personalized interventions can then be developed to support family caregivers in their caregiving activities while preserving their own health.


Resumo:

AIMS: Proprotein convertase subtilisin kexin 9 (PCSK9) is an emerging target for the treatment of hypercholesterolaemia, but the clinical utility of PCSK9 levels to guide treatment is unknown. We aimed to prospectively assess the prognostic value of plasma PCSK9 levels in patients with acute coronary syndromes (ACS). METHODS AND RESULTS: Plasma PCSK9 levels were measured in 2030 ACS patients undergoing coronary angiography in a Swiss prospective cohort. At 1 year, the association between PCSK9 tertiles and all-cause death was assessed adjusting for the Global Registry of Acute Coronary Events (GRACE) variables, as well as the achievement of LDL cholesterol targets of <1.8 mmol/L. Patients with higher PCSK9 levels at angiography were more likely to have clinical familial hypercholesterolaemia (rate ratio, RR 1.21, 95% confidence interval, CI 1.09-1.53), be treated with lipid-lowering therapy (RR 1.46, 95% CI 1.30-1.63), present with longer time interval of chest pain (RR 1.29, 95% CI 1.09-1.53) and higher C-reactive protein levels (RR 1.22, 95% CI 1.16-1.30). PCSK9 increased 12-24 h after ACS (374 ± 149 vs. 323 ± 134 ng/mL, P < 0.001). At 1 year follow-up, HRs for upper vs. lower PCSK9-level tertiles were 1.13 (95% CI 0.69-1.85) for all-cause death and remained similar after adjustment for the GRACE score. Patients with higher PCSK9 levels were less likely to reach the recommended LDL cholesterol targets (RR 0.81, 95% CI 0.66-0.99). CONCLUSION: In ACS patients, high initial PCSK9 plasma levels were associated with inflammation in the acute phase and hypercholesterolaemia, but did not predict mortality at 1 year.


Resumo:

BACKGROUND: While reduction of DUP (Duration of Untreated Psychosis) is a key goal in early intervention strategies, the predictive value of DUP on outcome has been questioned. We planned this study to explore the impact of three different definitions of "treatment initiation" on the predictive value of DUP on outcome in an early psychosis sample. METHODS: 221 early psychosis patients aged 18-35 were followed up prospectively over 36 months. DUP was measured using three definitions of treatment onset: initiation of antipsychotic medication (DUP1); engagement in a specialized programme (DUP2); and a combination of engagement in a specialized programme and adherence to medication (DUP3). RESULTS: 10% of patients never met the criteria for DUP3 and therefore were never adequately treated over the 36-month period of care. While DUP1 and DUP2 had limited predictive value on outcome, DUP3, based on a more restrictive definition of treatment onset, was a better predictor of positive and negative symptoms, as well as of functional outcome, at 12, 24 and 36 months. Globally, DUP3 explained 2 to 5 times more of the variance than DUP1 and DUP2, with effect sizes in the medium range according to Cohen. CONCLUSIONS: The limited predictive value of DUP on outcome in previous studies may be linked to definitions that do not take adherence to treatment into account. While these results need replication, they suggest that efforts to reduce DUP should continue and should aim both at early detection and at the development of engagement strategies.


Resumo:

This dissertation analyses the growing pool of copyrighted works that are offered to the public under Creative Commons licensing. The study consists of an analysis of this novel licensing system, of the licensors, and of the changes to the "all rights reserved" paradigm of copyright law. Copyright law reserves all rights to the creator until seventy years have passed since her demise. Many claim that this endangers communal interests. Quite often creators are willing to release some rights; this, however, is very difficult to do and requires the help of specialized lawyers. The study finds that the innovative Creative Commons licensing scheme is well suited for low-value, high-volume licensing, as it helps to reduce transaction costs on several levels. However, CC licensing is not a silver bullet: privacy, moral rights, the problems of license interpretation, and license compatibility with other open licenses and with collecting societies remain unsolved. The study consists of seven chapters. The first chapter introduces the research topic and research questions. The second and third chapters inspect the technical, economic and legal aspects of the Creative Commons licensing scheme. The fourth and fifth chapters examine the incentives of licensors who use open licenses and describe certain open business models. The sixth chapter studies the role of collecting societies and whether the two institutions, Creative Commons and collecting societies, can coexist. The final chapter summarizes the findings. The dissertation contributes to the existing literature in several ways: there is a wide range of prior research on open source licensing, but there is an urgent need for an extensive study of Creative Commons licensing and its actual and potential impact on the creative ecosystem.


Resumo:

The article describes some concrete problems that were encountered when writing a two-level model of Mari morphology. Mari is an agglutinative Finno-Ugric language spoken in Russia by about 600 000 people. The work was begun in the 1980s on the basis of K. Koskenniemi's Two-Level Morphology (1983), but in the latest stage R. Beesley's and L. Karttunen's Finite State Morphology (2003) was used. Many of the problems described in the article concern the inexplicitness of the rules in Mari grammars and the lack of information about the exact distribution of some suffixes, e.g. enclitics. Mari grammars usually give complete paradigms for a few unproblematic verb stems, whereas the difficult or unclear forms of certain verbs are discussed only superficially. Another example of a phenomenon poorly described in grammars is the way suffixes with an initial sibilant combine with stems ending in a sibilant. Informants and searches in electronic corpora were used to overcome such difficulties in developing the two-level model of Mari. Variation in the order of plural markers, case suffixes and possessive suffixes is a typical feature of Mari. The morphotactic rules constructed for Mari declensional forms tend to be recursive, and their productivity must be limited by some technical device, such as filters. In the present model, certain plural markers were treated like nouns. The positional and functional versatility of the possessive suffixes can be regarded as the most challenging phenomenon in attempts to formalize Mari morphology. The Cyrillic orthography used in the model also caused problems. For instance, a Cyrillic letter may represent a sequence of two sounds, the first being part of the word stem while the other belongs to a suffix. In some cases, letters for voiced consonants are also generalized to represent voiceless consonants. Such orthographical conventions distance a morphological model based on orthography from the actual (morpho)phonological processes in the language.
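The need to cap recursive morphotactic rules can be sketched with a toy generator in which a depth limit plays the role of the filter mentioned above. This is our own illustration, not the actual two-level implementation; the stem "kid" and suffix transliterations "-vlak" (plural) and "-shte" (a case suffix) are simplified examples chosen only to show variable suffix order:

```python
def generate_forms(stem, suffix_slots, max_suffixes=3):
    """Enumerate surface forms by appending suffixes from the given slots.

    Slots may apply in any order and repeatedly, mimicking recursive
    morphotactics; max_suffixes acts as the filter that bounds an
    otherwise unbounded rule set.
    """
    forms = [stem]
    frontier = [(stem, 0)]
    while frontier:
        form, depth = frontier.pop()
        if depth >= max_suffixes:
            continue  # the filter: stop expanding beyond the cap
        for slot in suffix_slots:
            for suffix in slot:
                new = form + suffix
                forms.append(new)
                frontier.append((new, depth + 1))
    return forms
```

With two single-suffix slots and a cap of 2, the generator produces both suffix orders ("-vlak-shte" and "-shte-vlak"), which is the kind of order variation the model has to accommodate.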


Resumo:

In recent decades, European educational systems have faced many challenges related to the treatment of cultural and linguistic diversity. The need to address this diversity requires new approaches to education; this in turn requires changes in the way we prepare teachers for the new reality they face in their classrooms. In this article we highlight some of the major problems that initial teacher training has to address in order to enable teachers to deal effectively, respectfully, and fairly with students whose linguistic and cultural background differs from their own. We also present several models of teacher education from Europe and North America based on clearly identified teacher competences for linguistic and cultural diversity.


Resumo:

This dissertation considers the segmental durations of speech from the viewpoint of speech technology, especially speech synthesis. The idea is that better models of segmental durations lead to higher naturalness and better intelligibility, which are key factors for the usability and generality of synthesized speech. Even though the studies are based on Finnish corpora, the approaches apply to other languages as well, probably because most of the studies included in this dissertation concern universal effects taking place at utterance boundaries. The methods developed and used here are likewise suitable for studies of other languages. The study is based on two corpora: one of news reading speech, read by a 39-year-old male, and one of sentences read aloud by several speakers in various situations. The use of two corpora serves two purposes: it allows a comparison between the corpora and gives a broader view of the matters of interest. The dissertation begins with an overview of the phonemes and the quantity system of the Finnish language. In particular, we cover the intrinsic durations of phonemes and phoneme categories, as well as the durational difference between short and long phonemes. The phoneme categories are introduced to address the problem of the variability of speech segments. We then cover the boundary-adjacent effects on segmental durations. In utterance-initial positions we find that there appears to be initial shortening in Finnish, but the result depends on the level of detail and on the individual phoneme. On the phoneme level, the shortening or lengthening affects only the very first phonemes of an utterance; on the word level, however, the effect on average shortens the whole first word. We establish the effect of final lengthening in Finnish. The existence of this effect in Finnish had long been an open question, and Finnish was the last missing piece needed to regard final lengthening as a universal phenomenon. Final lengthening is studied from various angles, and it is shown not to be a mere effect of prominence or an artefact of a speech corpus with high inter- and intra-speaker variation. The effect of final lengthening seems to extend from the final word to the penultimate word, and on the phoneme level it reaches a much wider area than the initial effect. We also present a normalization method suitable for corpus studies of segmental durations. The method uses utterance-level normalization to capture the pattern of segmental durations within each utterance, which prevents various problematic sources of variation within the corpora from affecting the results. The normalization is used in a study of final lengthening to show that the results are not caused by variation in the material. The dissertation also describes an implementation of speech synthesis on a mobile platform and evaluates its performance. We find that the rule-based speech synthesis method runs in real time as software, but the signal generation process slows the system beyond real time. Future prospects of speech synthesis on limited platforms are discussed. Finally, the dissertation considers ethical issues in the development of speech technology. The main focus is on the development of speech synthesis with high naturalness, but the problems and solutions are applicable to other speech technology approaches as well.
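Utterance-level normalization of the kind described can be sketched as a z-score computed within each utterance, so that duration patterns become comparable across utterances and speakers. This is our own minimal illustration, not the dissertation's exact method:

```python
import statistics

def normalize_utterance(durations):
    """Z-score normalize segment durations within a single utterance.

    Subtracting the utterance mean and dividing by the utterance
    standard deviation removes per-utterance (and per-speaker) offsets
    while preserving the relative duration pattern of the segments.
    """
    mean = statistics.fmean(durations)
    sd = statistics.stdev(durations)
    return [(d - mean) / sd for d in durations]
```

After such normalization, a consistently positive value at utterance-final segments across a corpus would indicate final lengthening rather than corpus-level duration variation.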


Resumo:

The pollution and toxicity problems posed by arsenic in the environment have long been established. Hence, removal and recovery remedies have been sought, bearing in mind the efficiency, cost-effectiveness and environmental friendliness of the methods employed. The sorption kinetics and intraparticulate diffusivity of As(III) bioremediation from aqueous solution using modified and unmodified coconut fiber were investigated. The amount adsorbed increased with time, reaching equilibrium at about 60 minutes. The kinetic studies showed that the sorption rates could be described by both pseudo-first-order and pseudo-second-order processes, with the latter showing a better fit, giving a rate constant of 1.16 x 10^-4 min^-1 for the three adsorbent types. The mechanism of sorption was found to be particle-diffusion controlled. The diffusion and boundary layer effects were also investigated. The results therefore show that coconut fiber, both modified and unmodified, is an efficient sorbent for the removal of As(III) from industrial effluents, with particle diffusion as the predominant mechanism.
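The pseudo-second-order model referred to predicts the amount adsorbed at time t as q_t = k2 * qe^2 * t / (1 + k2 * qe * t), where qe is the equilibrium capacity and k2 the rate constant. A minimal sketch with hypothetical parameter values (not the study's fitted data):

```python
def pseudo_second_order(t, qe, k2):
    """Amount adsorbed q_t at time t under pseudo-second-order kinetics.

    qe is the equilibrium adsorption capacity and k2 the rate constant;
    q_t rises from 0 and approaches qe as t grows large.
    """
    return (k2 * qe**2 * t) / (1.0 + k2 * qe * t)
```

In practice the parameters are usually obtained from the linearized form t/q_t = 1/(k2*qe^2) + t/qe, whose slope and intercept give qe and k2 from experimental (t, q_t) pairs.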


Resumo:

In this report, we summarize the results of our part of the ÄLYKOP project on customer value creation at the intersection of the health care, ICT, forest and energy industries. The research describes how industry transformation and convergence create new possibilities, business opportunities and even new industries. The report consists of findings presented earlier in academic publications, and it discusses customer value, service provision and the resource basis of the novel concepts through multiple theoretical frameworks. The report is divided into three main sections: the theoretical background, a discussion of the health care industry, and evaluations of novel smart home concepts. Transaction cost economics and the resource-based view of the firm provide the theoretical basis for analyzing the phenomena described. The health care industry analysis describes the most important changes in the demand conditions of health care services and explores the features that are likely to open new business opportunities for a solution provider. The third part of the report, on the smart home business, illustrates a few potential concepts that can provide solutions to the economic problems arising from the aging of the population. The results yield several recommendations for smart home platform developers in the public and private sectors. According to the analysis, public organizations dominate service provision, and private markets are at present in an emergent state. We argue that public-private partnerships are necessary for creating key suppliers. Indeed, paying attention to appropriate regulation, service specifications and technology standards would foster the diffusion of new services. The dynamics of the service provision networks are driven by the need for new capabilities required to adapt business concepts to the new competitive situation. Finally, the smart home framework revealed links between conventionally distant business areas such as health care and energy distribution: the platform integrates functionalities for different purposes which nevertheless draw on the same resource basis.