950 results for Initial Value Problem


Relevance: 30.00%

Publisher:

Abstract:

We propose a new solution concept to address the problem of sharing a surplus among the agents generating it. The problem is formulated in the preferences-endowments space. The solution is defined recursively, incorporating notions of consistency and fairness and relying on properties satisfied by the Shapley value for Transferable Utility (TU) games. We show that a solution exists, and call it the Ordinal Shapley value (OSV). We characterize the OSV using the notion of coalitional dividends, and furthermore show it is monotone and anonymous. Finally, similarly to the weighted Shapley value for TU games, we construct a weighted OSV as well.
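As background to the properties the construction relies on, here is a minimal sketch of the classical TU Shapley value (average marginal contribution over all player orderings); the characteristic function below is invented purely for illustration and is not the OSV itself:

```python
from itertools import permutations

def shapley(players, v):
    """Classical TU Shapley value: average marginal contribution
    of each player over all orderings of the players."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: phi[p] / len(orders) for p in players}

# Hypothetical 2-player surplus function, for illustration only.
v = lambda S: {frozenset(): 0, frozenset({1}): 1,
               frozenset({2}): 3, frozenset({1, 2}): 6}[frozenset(S)]
print(shapley([1, 2], v))  # efficiency: the shares sum to v({1,2}) = 6
```

The recursion in the OSV is considerably more involved, since it operates on the preferences-endowments space rather than on transferable utility.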

Relevance: 30.00%

Publisher:

Abstract:

Among PET radiotracers, FDG seems to be widely accepted as an accurate oncology diagnostic tool, frequently helpful also in the evaluation of treatment response and in radiation therapy treatment planning for several cancer sites. In contrast, the reliability of choline as a tracer for prostate cancer (PC) remains an object of debate for clinicians, including radiation oncologists. This review focuses on the available data about the potential impact of choline-PET in the daily clinical practice of radiation oncologists managing PC patients. In summary, routine choline-PET is not indicated for initial local T staging, but it seems better than conventional imaging for nodal staging and for all patients with suspected metastases. In these settings, choline-PET has shown the potential to change patient management. A critical limit remains spatial resolution, which restricts accuracy and reliability for small lesions. After a PSA rise, the problem of the trigger PSA value remains crucial. Indeed, the overall detection rate of choline-PET increases significantly when the trigger PSA, or the doubling time, increases, but higher PSA levels are often a sign of metastatic spread, a contraindication for potentially curative local treatments such as radiation therapy. Even though several published studies seem promising, the current role of PET in treatment planning for PC patients to be irradiated remains under investigation. Based on available literature data, all these issues are addressed and discussed in this review.

Relevance: 30.00%

Publisher:

Abstract:

The application of multi-region environmental input-output (IO) analysis to the problem of accounting for emissions generation (and/or resource use) under different accounting principles has become increasingly common in the ecological and environmental economics literature in particular, with applications at the international and interregional subnational level. However, while environmental IO analysis is invaluable in accounting for pollution flows in the single time period that the accounts relate to, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply-side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. Where analysis of marginal changes in activity is required, extension from an IO accounting framework to a more flexible interregional computable general equilibrium (CGE) approach, where behavioural relationships can be modelled in a more realistic and theory-consistent manner, is appropriate. Our argument is illustrated by comparing the results of introducing a positive demand stimulus in the UK economy using IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory-consistent modelling of both demand-side and supply-side behaviour at the regional and national levels affects model results, including the impact on the interregional CO2 ‘trade balance’.
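The fixed-proportions restriction referred to above comes from the standard demand-driven Leontief model, where gross output solves x = (I − A)⁻¹ d for a technical-coefficients matrix A and final demand d. A minimal numerical sketch, with an invented two-sector A matrix (the numbers carry no empirical meaning):

```python
import numpy as np

# Hypothetical 2-sector technical-coefficients matrix A (invented numbers):
# A[i, j] = input from sector i required per unit of sector j's output.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
d = np.array([100.0, 50.0])   # final demand vector

# Leontief inverse (the multiplier matrix): x = (I - A)^-1 d
L = np.linalg.inv(np.eye(2) - A)
x = L @ d
print(x)  # gross output needed to meet d, assuming infinitely elastic supply
```

Any demand stimulus is propagated through L with no supply-side constraint and no price response, which is exactly the limitation that motivates the move to a CGE framework.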

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Many emergency department (ED) providers do not follow guideline recommendations for the use of the pneumonia severity index (PSI) to determine the initial site of treatment for patients with community-acquired pneumonia (CAP). We identified the reasons why ED providers hospitalize low-risk patients or manage higher-risk patients as outpatients. METHODS: As a part of a trial to implement a PSI-based guideline for the initial site of treatment of patients with CAP, we analyzed data for patients managed at 12 EDs allocated to a high-intensity guideline implementation strategy study arm. The guideline recommended outpatient care for low-risk patients (nonhypoxemic patients with a PSI risk classification of I, II, or III) and hospitalization for higher-risk patients (hypoxemic patients or patients with a PSI risk classification of IV or V). We asked providers who made guideline-discordant decisions on site of treatment to detail the reasons for nonadherence to guideline recommendations. RESULTS: There were 1,306 patients with CAP (689 low-risk patients and 617 higher-risk patients). Among these patients, physicians admitted 258 (37.4%) of 689 low-risk patients and treated 20 (3.2%) of 617 higher-risk patients as outpatients. The most commonly reported reasons for admitting low-risk patients were the presence of a comorbid illness (178 [71.5%] of 249 patients); a laboratory value, vital sign, or symptom that precluded ED discharge (73 patients [29.3%]); or a recommendation from a primary care or a consulting physician (48 patients [19.3%]). Higher-risk patients were most often treated as outpatients because of a recommendation by a primary care or consulting physician (6 [40.0%] of 15 patients). CONCLUSION: ED providers hospitalize many low-risk patients with CAP, most frequently for a comorbid illness. Although higher-risk patients are infrequently treated as outpatients, this decision is often based on the request of an involved physician.
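The guideline's site-of-treatment rule described above can be sketched as a simple decision function; the function name and the boolean hypoxemia flag are illustrative simplifications of the trial's actual criteria:

```python
def guideline_site_of_treatment(psi_class: int, hypoxemic: bool) -> str:
    """Site-of-treatment rule from the trial's guideline:
    outpatient care for nonhypoxemic patients with PSI risk class I-III,
    hospitalization for hypoxemic patients or PSI risk class IV-V."""
    if not 1 <= psi_class <= 5:
        raise ValueError("PSI risk class must be I-V (1-5)")
    if hypoxemic or psi_class >= 4:
        return "hospitalize"
    return "outpatient"

print(guideline_site_of_treatment(2, False))  # low-risk -> outpatient
print(guideline_site_of_treatment(3, True))   # hypoxemic -> hospitalize
```

The study's point is precisely that real decisions frequently departed from this rule, most often because of comorbid illness or a consulting physician's recommendation.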

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: The prevalence of adolescent obesity has increased considerably over the past decade in Switzerland and has become a serious public health problem in Europe. Prevention of obesity using various comprehensive programmes appears very promising, although several interventions have had generally disappointing results compared with the objectives and targets initially set. Holistic programmes combining nutritional education with the promotion of physical activity and behaviour modification constitute the key factors in the prevention of childhood and adolescent obesity. The purpose of this programme was to incorporate nutrition/physical education as well as psychological aspects in selected secondary schools (9th grade, 14-17 years). METHODS: The educational strategy was based on the development of a series of 13 practical workshops covering wide-ranging areas such as physical inactivity, body composition, sugar, energy density, invisible lipids, how to read food labels, "Is meal duration important?", "Do you eat with pleasure or not?", "Do you eat because you are hungry?" and emotional eating. For teachers' continuing education, a basic, highly illustrated guide was developed as a companion booklet to the workshops. These materials were first validated by biology and physical education teachers, dieticians, psychologists and school medical officers. RESULTS: Teachers considered the practical educational materials innovative, useful, motivational and easy to understand. Up to now (early 2008), the programme has been implemented in 50 or more classes from schools in three areas in the French-speaking part of Switzerland. Based on 1-week pedometer values assessed before and after the one-school-year programme, an initial evaluation indicated that overall physical inactivity decreased significantly, as evidenced by a significant rise in the number of steps per day.
CONCLUSION: Future evaluation will provide more information on the effectiveness of the ADOS programme.

Relevance: 30.00%

Publisher:

Abstract:

We introduce and analyze two new semi-discrete numerical methods for the multi-dimensional Vlasov-Poisson system. The schemes are constructed by combining a discontinuous Galerkin approximation to the Vlasov equation with a mixed finite element method for the Poisson problem. We show optimal error estimates in the case of smooth compactly supported initial data. We also propose a scheme that preserves the total energy of the system.
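For reference, a standard form of the Vlasov-Poisson system being discretized is the following (normalizations and sign conventions vary by application, so the cited schemes' exact statement may differ):

```latex
\begin{aligned}
&\partial_t f + v \cdot \nabla_x f + E(t,x) \cdot \nabla_v f = 0, \\
&E = -\nabla_x \phi, \qquad -\Delta_x \phi = \rho(t,x) := \int f(t,x,v)\,\mathrm{d}v, \\
&f(0,x,v) = f_0(x,v),
\end{aligned}
```

where f is the phase-space distribution of particles, E the self-consistent field, and f_0 the smooth compactly supported initial datum.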

Relevance: 30.00%

Publisher:

Abstract:

The Admiral, a new microporous membrane oxygenator with a low surface area, a decreased priming volume and two separate reservoirs, was tested in 30 adult patients. This study was undertaken to evaluate blood path resistance, gas exchange capabilities and blood trauma in clinical use, with and without shed blood separation. Patients were divided into 3 groups. Group 1 had valve surgery without separation of suction, Group 2 had coronary artery bypass grafting (CABG) with direct blood aspiration and Group 3 had coronary artery bypass grafting with shed blood separation. The suctioned, separated, cardiotomy blood in Group 3 was treated with an autotransfusion device at the end of bypass before being returned to the patient. Theoretical blood flow could be achieved in all cases without problem. The pressure drop through the oxygenator averaged 88 +/- 13 mmHg at 4 l/min and 109 +/- 12 mmHg at 5 l/min. O(2) transfer was 163 +/- 27 ml/min. Free plasma haemoglobin rose in all groups, but significantly less in Group 3. Lactate dehydrogenase (LDH) rose significantly in Groups 1 and 2. Platelets decreased in all groups without significant differences. Clinical use of this new oxygenator was safe: the reduced membrane surface did not impair gas exchange, and blood trauma could be minimized easily by separating shed blood, using the second cardiotomy reservoir.

Relevance: 30.00%

Publisher:

Abstract:

Background The 'database search problem', that is, the strengthening of a case - in terms of probative value - against an individual who is found as a result of a database search, has been approached during the last two decades with substantial mathematical analyses, accompanied by lively debate and centrally opposing conclusions. This represents a challenging obstacle in teaching but also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct view on the main debated issues, along with further clarity. Methods As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular, Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing but exclusively formulaic approaches. Conclusions The proposed Bayesian networks allow one to capture and analyze the currently most well-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond the traditional, purely formulaic expressions. 
The method's graphical environment, along with its computational and probabilistic architectures, represents a rich package that offers analysts and discussants additional modes of interaction, concise representation, and coherent communication.
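The probabilistic effect those formal analyses capture can be illustrated even without a Bayesian-network library. Assume a uniform prior over a population of N potential sources, a database of n profiles, a random match probability γ for non-sources, and the observation that exactly one database member matches; brute-force enumeration over source hypotheses then reproduces the well-known closed form 1/(1 + (N − n)γ). All numbers below are invented for illustration:

```python
def posterior_matching_member(N, n, gamma):
    """Posterior probability that the unique matching database member
    (individual 0) is the source, by enumerating all N source hypotheses.
    Observed data: member 0 matches, the other n-1 members do not."""
    likelihoods = []
    for source in range(N):
        if source == 0:                       # the matching member is the source
            lik = (1 - gamma) ** (n - 1)      # other n-1 members fail to match
        elif source < n:                      # another database member is the source
            lik = 0.0                         # they would match, but were seen not to
        else:                                 # source lies outside the database
            lik = gamma * (1 - gamma) ** (n - 1)  # member 0 matched by chance
        likelihoods.append(lik)               # uniform prior: equal weights
    return likelihoods[0] / sum(likelihoods)

N, n, gamma = 10_000, 200, 1e-3
p = posterior_matching_member(N, n, gamma)
closed_form = 1 / (1 + (N - n) * gamma)
print(p, closed_form)
```

A full Bayesian network additionally handles uncertainty in N, γ and the relevant population, which is where the graphical approach adds value over this toy enumeration.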

Relevance: 30.00%

Publisher:

Abstract:

Objective: Cardiac troponin I (cTnI) is a well-recognized early postoperative marker of myocardial damage in adults and children after heart surgery. The present study was undertaken to evaluate whether the integrated value (area under the curve, AUC) of postoperative cTnI predicts long-term outcome better than the maximum postoperative cTnI value after surgery for congenital heart defects (CHD). Methods: Retrospective cohort study. 279 patients (mean age 4.6 years; range 0-17 years; 185 males) with congenital heart defect repair on cardiopulmonary bypass were retrieved from our database, including postoperative cTnI values. The maximal postoperative cTnI value, the postoperative cTnI AUC at 48 h and the total postoperative cTnI AUC were calculated and then correlated with duration of intubation, duration of ICU stay and mortality. Results: The mean duration of mechanical ventilation was 5.1 +/- 7.2 days and the mean duration of ICU stay was 11.0 +/- 13.3 days; 11 patients (3.9%) died in the postoperative period. When comparing the survivor and deceased groups, there was a significant difference in the mean value for max cTnI (16.7 +/- 21.8 vs 59.2 +/- 41.4 mcg/l, p < 0.0001), 48-h AUC cTnI (82.0 +/- 110.7 vs 268.8 +/- 497.7 mcg/l, p < 0.0001) and total AUC cTnI (623.8 +/- 1216.7 vs 2564 +/- 2826.0, p < 0.0001). Linear regression analyses of duration of mechanical ventilation and duration of ICU stay demonstrated a better correlation for 48-h AUC cTnI (ventilation time r = 0.82, p < 0.0001; ICU stay r = 0.74, p < 0.0001) than for total AUC cTnI (ventilation time r = 0.65, p < 0.0001; ICU stay r = 0.60, p < 0.0001) and max cTnI (ventilation time r = 0.64, p < 0.0001; ICU stay r = 0.60, p < 0.0001). Conclusion: Cardiac troponin I is a specific and sensitive marker of myocardial injury after congenital heart surgery and may predict early in-hospital outcomes. Integrating the postoperative cTnI value by calculating the AUC improves prediction of early in-hospital outcomes.
It probably reflects not only the initial surgical procedure but also the occurrence of hypoxic-ischemic phenomena in the postoperative period.
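The integrated cTnI exposure described above is typically computed with the trapezoidal rule over the serial measurements; a minimal sketch, with invented sampling times and values (not data from the study):

```python
def auc_trapezoid(times_h, values):
    """Area under the concentration-time curve by the trapezoidal rule.
    times_h: sampling times in hours; values: cTnI in mcg/l."""
    if len(times_h) != len(values) or len(times_h) < 2:
        raise ValueError("need matched times and at least two samples")
    return sum((t1 - t0) * (v0 + v1) / 2
               for t0, t1, v0, v1 in zip(times_h, times_h[1:],
                                         values, values[1:]))

# Invented serial cTnI measurements over the first 48 h:
times = [0, 6, 12, 24, 48]            # hours after surgery
ctni  = [2.0, 30.0, 24.0, 12.0, 4.0]  # mcg/l
print(auc_trapezoid(times, ctni))     # 48-h AUC in mcg*h/l
```

Unlike the single peak value, this integral is sensitive to how long cTnI stays elevated, which is presumably why it also captures postoperative hypoxic-ischemic events.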

Relevance: 30.00%

Publisher:

Abstract:

The repeated presentation of simple objects as well as biologically salient objects can cause adaptation of behavioral and neural responses during the visual categorization of these objects. Mechanisms of response adaptation during repeated food viewing are of particular interest for better understanding food intake beyond energetic needs. Here, we measured visual evoked potentials (VEPs) and conducted neural source estimations for initial and repeated presentations of high-energy and low-energy foods as well as non-food images. The results of our study show that the behavioral and neural responses to food and food-related objects are not uniformly affected by repetition. While the repetition of images displaying low-energy foods and non-food objects modulated VEPs as well as their underlying neural sources and increased behavioral categorization accuracy, the responses to high-energy images remained largely invariant between initial and repeated encounters. Brain mechanisms engaged when viewing images of high-energy foods thus appear less susceptible to repetition effects than responses to low-energy and non-food images. This finding is likely related to the superior reward value of high-energy foods and might be one reason why high-energy foods in particular are indulged in despite their potentially detrimental health consequences.

Relevance: 30.00%

Publisher:

Abstract:

S100B is a prognostic factor for melanoma, as elevated levels correlate with disease progression and poor outcome. We determined its prognostic value based on updated information using serial determinations in stage IIb/III melanoma patients. 211 patients who participated in the EORTC 18952 trial, evaluating the efficacy of adjuvant intermediate doses of interferon α2b (IFN) versus observation, entered a corollary study. Over a period of 36 months, 918 serum samples were collected. The Cox time-dependent model was used to assess the prognostic value of the latest (most recent) S100B determination. At the first measurement, 178 patients had S100B values <0.2 μg/l and 33 had values ≥0.2 μg/l. Within the first group, 61 patients later had an increased value of S100B (≥0.2 μg/l). An increased value of S100B, initially or during follow-up, was associated with worse distant metastasis-free survival (DMFS); the hazard ratio (HR) of S100B ≥0.2 versus S100B <0.2 was 5.57 (95% confidence interval (CI) 3.81-8.16), P < 0.0001, after adjustment for stage, number of lymph nodes and sex. In stage IIb patients, the HR adjusted for sex was 2.14 (95% CI 0.71-6.42), whereas in stage III patients, the HR adjusted for stage, number of lymph nodes and sex was 6.76 (95% CI 4.50-10.16). Similar results were observed for overall survival (OS). Serial determination of S100B in stage IIb-III melanoma is a strong independent prognostic marker, even stronger than stage and number of positive lymph nodes. The prognostic impact of S100B ≥0.2 μg/l is more pronounced in stage III disease than in stage IIb.

Relevance: 30.00%

Publisher:

Abstract:

A detailed investigation was conducted on core samples taken from 17 portland cement concrete pavements located in Iowa. The goal of the investigation was to help clarify the root cause of the premature deterioration problem that has become evident since the early 1990s. Laboratory experiments were also conducted to evaluate how cement composition, mixing time, and admixtures could have influenced the occurrence of premature deterioration. The cements used in this study were selected in an attempt to cover the main compositional parameters pertinent to the construction industry in Iowa. The hardened air content determinations conducted during this study indicated that the pavements that exhibited premature deterioration often contained poor to marginal entrained-air void systems. In addition, petrographic studies indicated that sometimes the entrained-air void system had been marginal after mixing and placement of the pavement slab, while in other instances a marginal to adequate entrained-air void system had been filled with ettringite. The filling was most probably accelerated by shrinkage cracking at the surface of the concrete pavements. The results of this study suggest that the durability (more specifically, the frost resistance) of the concrete pavements would be less than anticipated during the design stage of the pavements. Construction practices played a significant role in the premature deterioration problem. The pavements that exhibited premature distress also exhibited features that suggested poor mixing and poor control of aggregate grading. Segregation was very common in the cores extracted from the pavements that exhibited premature distress. This suggests that the vibrators on the paver were used to overcome a workability problem. Entrained-air voids formed in concrete mixtures experiencing these types of problems normally tend to be extremely coarse, and hence they can easily be lost during the paving process.
This tends to leave the pavement with a low air content and a poor distribution of air voids. All of these features were consistent with a premature stiffening problem that drastically influenced the contractor's ability to place the concrete mixture. Laboratory studies conducted during this project indicated that most premature stiffening problems can be directly attributed to the portland cement used on the project. The admixtures (class C fly ash and water reducer) tended to have only a minor influence on the premature stiffening problem when used at the dosage rates described in this study.

Relevance: 30.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process for the initial solution that randomly generates different alternative initial solutions of similar quality, attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating our biased randomization process, together with a high-quality pseudo-random number generator, into it.
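The biased-randomization idea described above (drawing from a greedy-sorted candidate list with a distribution skewed toward the best-ranked candidates, so each run yields a different but still good starting sequence) can be sketched as follows. The truncated-geometric bias parameter and the total-processing-time greedy criterion are illustrative assumptions, not the paper's exact procedure:

```python
import random

def pick_rank(n, beta, rng):
    """Truncated geometric draw over ranks 0..n-1:
    rank 0 with probability ~beta, decaying for later ranks."""
    i = 0
    while i < n - 1 and rng.random() > beta:
        i += 1
    return i

def biased_random_start(proc_times, beta=0.3, seed=None):
    """Biased-randomized starting sequence for the PFSP (sketch).
    proc_times: {job: [time on machine 1, time on machine 2, ...]}.
    Jobs are greedily ranked by decreasing total processing time, then
    drawn with a geometric bias so the sequence stays close to the
    greedy ordering while differing from run to run."""
    rng = random.Random(seed)
    candidates = sorted(proc_times, key=lambda j: -sum(proc_times[j]))
    seq = []
    while candidates:
        seq.append(candidates.pop(pick_rank(len(candidates), beta, rng)))
    return seq

# Invented 4-job, 2-machine instance for illustration.
jobs = {"J1": [4, 6], "J2": [9, 2], "J3": [5, 5], "J4": [1, 3]}
print(biased_random_start(jobs, seed=42))
```

Because each parallel worker only needs a different seed, this diversification maps naturally onto the parallel and distributed setting mentioned above.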

Relevance: 30.00%

Publisher:

Abstract:

The forensic two-trace problem is a perplexing inference problem introduced by Evett (J Forensic Sci Soc 27:375-381, 1987). Different possible ways of wording the competing pair of propositions (i.e., one proposition advanced by the prosecution and one proposition advanced by the defence) led to different quantifications of the value of the evidence (Meester and Sjerps in Biometrics 59:727-732, 2003). Here, we re-examine this scenario with the aim of clarifying the interrelationships that exist between the different solutions, and in this way, produce a global vision of the problem. We propose to investigate the different expressions for evaluating the value of the evidence by using a graphical approach, i.e. Bayesian networks, to model the rationale behind each of the proposed solutions and the assumptions made on the unknown parameters in this problem.

Relevance: 30.00%

Publisher:

Abstract:

The subject "Value and prices in Russian economic thought (1890-1920)" should evoke several names and debates in the reader's mind. For a long time, Western scholars have been aware that the Russian economists Tugan-Baranovsky and Bortkiewicz were active participants in the debate on the Marxian transformation problem, that the mathematical models of Dmitriev prefigured later neo-Ricardian models, and that many Russian economists either supported the Marxian labour theory of value or were revisionists. Moreover, these ideas prepared the ground for Soviet planning. Russian scholars additionally knew that this period saw the introduction of marginalism in Russia, and that, during this period, economists were actively thinking about the relation of ethics to economic theory. All these issues are well covered in the existing literature. But there is a big gap that this dissertation intends to fill. The existing literature handles these pieces separately, although they are part of a single, more general, history. All these issues (the labour theory of value, marginalism, the Marxian transformation problem, planning, ethics, mathematical economics) were part of what this dissertation calls "the Russian synthesis". The Russian synthesis (in the singular) designates here all the attempts at synthesis between classical political economy and marginalism, between the labour theory of value and marginal utility, and between value and prices that occurred in Russian economic thought between 1890 and 1920, and it embraces the whole set of issues evoked above. This dissertation has the ambition of being the first comprehensive history of that Russian synthesis. In this, this contribution is unique. It has always surprised the author of the present dissertation that such a book has not yet been written. Several good reasons, both the scarce availability of sources and ideological restrictions, may account for a delay of several decades.
But it is now urgent to remedy the situation before the protagonists of the Russian synthesis are definitively classified under the wrong labels in the pantheon of economic thought. To accomplish this task, it has seldom been sufficient to gather together the various existing studies on aspects of this story. It has been necessary to return to the primary sources in the Russian language. The most important part of the primary literature has never been translated, and in recent years only some of it has been republished in Russian. Therefore, most translations from the Russian have been made by the author of the present dissertation. The secondary literature has been surveyed in the languages that are familiar (Russian, English and French) or almost familiar (German) to the present author, and which are hopefully the most pertinent to the present investigation. Besides, and in order to increase acquaintance with the texts, which was the objective of all this, some archival sources were used. The analysis consists of careful chronological studies of the authors' writings and their evolution in their historical and intellectual context. As a consequence, the dissertation brings new authors to the foreground (Shaposhnikov and Yurovsky) who were traditionally confined to the substitutes' bench, because they only superficially touched the domains quoted above. In the Russian synthesis, however, they played an important part in the story. As a side effect, some authors who used to play in the foreground (Dmitriev and Bortkiewicz) are relegated to the background, but are not forgotten. Besides, the dissertation refreshes the views on authors already known, such as Ziber and, especially, Tugan-Baranovsky. The ultimate objective of this dissertation is to change the opinion that one could have of "value and prices in Russian economic thought", by setting the Russian synthesis at the centre of the debates.