893 results for Problem gambling


Relevance: 20.00%

Abstract:

Background: The Palliative Care Problem Severity Score is a clinician-rated tool to assess problem severity in four palliative care domains (pain, other symptoms, psychological/spiritual, family/carer problems) using a 4-point categorical scale (absent, mild, moderate, severe). Aim: To test the reliability and acceptability of the Palliative Care Problem Severity Score. Design: Multi-centre, cross-sectional study involving pairs of clinicians independently rating problem severity using the tool. Setting/participants: Clinicians from 10 Australian palliative care services: 9 inpatient units and 1 mixed inpatient/community-based service. Results: A total of 102 clinicians participated, with almost 600 paired assessments completed for each domain, involving 420 patients. A total of 91% of paired assessments were undertaken within 2 h. Strength of agreement was moderate for three of the four domains: pain (Kappa = 0.42, 95% confidence interval = 0.36 to 0.49); psychological/spiritual (Kappa = 0.48, 95% confidence interval = 0.42 to 0.54); and family/carer (Kappa = 0.45, 95% confidence interval = 0.40 to 0.52). Strength of agreement for the remaining domain (other symptoms) was fair (Kappa = 0.38, 95% confidence interval = 0.32 to 0.45). Conclusion: The Palliative Care Problem Severity Score is an acceptable measure, with moderate reliability across three domains. Variability in inter-rater reliability across sites and participant feedback indicate that ongoing education is required to ensure that clinicians understand the purpose of the tool and each of its domains. Raters familiar with the patient they were assessing found it easier to assign problem severity, but this did not improve inter-rater reliability.
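The agreement statistics above are Cohen's kappa for pairs of raters. As a minimal sketch, the unweighted statistic on a 4-point scale can be computed as follows (the paired ratings below are illustrative, not data from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = sorted(set(rater_a) | set(rater_b))
    # Observed agreement: fraction of pairs where both raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's marginals.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in categories) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical paired severity ratings on the 4-point scale
# (0 = absent, 1 = mild, 2 = moderate, 3 = severe).
a = [0, 1, 1, 2, 2, 3, 0, 1, 2, 3]
b = [0, 1, 2, 2, 2, 3, 0, 0, 2, 3]
print(round(cohens_kappa(a, b), 3))  # → 0.733
```

Kappa corrects raw agreement for the agreement expected by chance, which is why a study reports it rather than simple percentage agreement.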

Relevance: 20.00%

Abstract:

A new theory of shock dynamics has been developed in the form of a finite number of compatibility conditions along shock rays. It has been used to study the growth or decay of shock strength for an accelerating or decelerating piston starting with a nonzero piston velocity. The results show good agreement with those obtained by Harten's high-resolution TVD scheme.
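Harten's high-resolution scheme itself is involved; as a hedged illustration of the TVD property such comparisons rely on, the sketch below uses a first-order Lax-Friedrichs scheme (monotone, hence TVD, but not Harten's method) for the inviscid Burgers equation, a standard model of piston-driven shock steepening. The grid and initial profile are invented for illustration:

```python
import numpy as np

def lax_friedrichs_burgers(u0, dx, dt, steps):
    """First-order Lax-Friedrichs scheme for u_t + (u^2/2)_x = 0 with
    periodic boundaries. Monotone first-order schemes are TVD: the
    total variation of the solution never increases."""
    u = u0.copy()
    f = lambda w: 0.5 * w ** 2
    for _ in range(steps):
        up = np.roll(u, -1)  # u_{j+1}
        um = np.roll(u, 1)   # u_{j-1}
        u = 0.5 * (up + um) - 0.5 * (dt / dx) * (f(up) - f(um))
    return u

def total_variation(u):
    return np.abs(np.diff(u)).sum()

# Smooth initial profile that steepens into a shock.
x = np.linspace(0, 2 * np.pi, 200, endpoint=False)
u0 = 1.0 + 0.5 * np.sin(x)
dx = x[1] - x[0]
dt = 0.4 * dx / np.abs(u0).max()  # CFL condition
u = lax_friedrichs_burgers(u0, dx, dt, 300)
print(total_variation(u0), total_variation(u))
```

The check that total variation does not grow is the defining property that makes TVD schemes a trustworthy reference for shock-strength comparisons.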

Relevance: 20.00%

Abstract:

In this paper, we first recast the generalized symmetric eigenvalue problem, where the underlying matrix pencil consists of symmetric positive definite matrices, into an unconstrained minimization problem by constructing an appropriate cost function. We then extend it to the case of multiple eigenvectors using an inflation technique. Based on this asymptotic formulation, we derive a quasi-Newton-based adaptive algorithm for estimating the required generalized eigenvectors in the data case. The resulting algorithm is modular and parallel, and it is globally convergent with probability one. We also analyze the effect of inexact inflation on the convergence of this algorithm, and that of inexact knowledge of one of the matrices (in the pencil) on the resulting eigenstructure. Simulation results demonstrate that the performance of this algorithm is almost identical to that of the rank-one updating algorithm of Karasalo. Further, the performance of the proposed algorithm has been found to remain stable even over 1 million updates without suffering from any error-accumulation problems.
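The abstract does not give the paper's cost function or quasi-Newton update; the sketch below only illustrates the underlying idea of recasting Av = λBv as unconstrained minimization, using plain gradient descent on the generalized Rayleigh quotient (the function name, step size, and test pencil are invented for illustration):

```python
import numpy as np

def smallest_generalized_eigpair(A, B, iters=5000, lr=0.05, seed=0):
    """Gradient descent on the generalized Rayleigh quotient
    r(x) = (x^T A x) / (x^T B x), whose minimizers are eigenvectors of
    the pencil (A, B) for the smallest generalized eigenvalue."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        Ax, Bx = A @ x, B @ x
        r = (x @ Ax) / (x @ Bx)
        grad = 2.0 * (Ax - r * Bx) / (x @ Bx)
        x = x - lr * grad
        x = x / np.linalg.norm(x)  # r is scale-invariant; keep x well scaled
    lam = (x @ A @ x) / (x @ B @ x)
    return lam, x

# Small symmetric positive definite test pencil.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 2.0]])
lam, v = smallest_generalized_eigpair(A, B)
print("residual:", np.linalg.norm(A @ v - lam * (B @ v)))
```

The paper's adaptive algorithm replaces this batch descent with a quasi-Newton update driven by streaming data, and uses inflation to extract further eigenvectors after the first.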

Relevance: 20.00%

Abstract:

Compulsators are power sources of choice for use in electromagnetic launchers and railguns. These devices hold the promise of reducing unit costs of payload to orbit. In an earlier work, the author had calculated the current distribution in compulsator wires by considering the wire to be split into a finite number of separate wires. The present work develops an integral formulation of the problem of current distribution in compulsator wires which leads to an integrodifferential equation. Analytical solutions, including those for the integration constants, are obtained in closed form. The analytical solutions present a much clearer picture of the effect of various input parameters on the cross-sectional current distribution and point to ways in which the desired current density distribution can be achieved. Results are graphically presented and discussed, with particular reference to a 50-kJ compulsator in Bangalore. Finite-element analysis supports the results.

Relevance: 20.00%

Abstract:

Computation of the dependency basis is the fundamental step in solving the implication problem for MVDs in relational database theory. We examine this problem from an algebraic perspective. We introduce the notion of the inference basis of a set M of MVDs and show that it contains the maximum information about the logical consequences of M. We propose the notion of an MVD-lattice and develop an algebraic characterization of the inference basis using simple notions from lattice theory. We also establish several properties of MVD-lattices related to the implication problem. Founded on our characterization, we synthesize efficient algorithms for (a) computing the inference basis of a given set M of MVDs; (b) computing the dependency basis of a given attribute set w.r.t. M; and (c) solving the implication problem for MVDs. Finally, we show that our results naturally extend to incorporate FDs as well, in a way that enables the solution of the implication problem for FDs and MVDs taken together.
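The abstract does not spell out its lattice-based algorithms. For context, the classical refinement algorithm for the dependency basis, and the implication test built on it, can be sketched as follows (function names are illustrative; this is the textbook algorithm, not the paper's):

```python
def dependency_basis(W, mvds, U):
    """Dependency basis of attribute set W w.r.t. a set of MVDs X ->> Y
    over universe U: start from the single block U - W and repeatedly
    split any block that an applicable MVD separates."""
    W, U = frozenset(W), frozenset(U)
    basis = {U - W} if U - W else set()
    changed = True
    while changed:
        changed = False
        for X, Y in mvds:
            X, Y = frozenset(X), frozenset(Y)
            for S in list(basis):
                # S is split by X ->> Y when S misses X but straddles Y.
                if not (S & X) and (S & Y) and (S - Y):
                    basis.remove(S)
                    basis |= {S & Y, S - Y}
                    changed = True
    return basis

def mvd_implied(X, Y, mvds, U):
    """X ->> Y follows from the MVDs iff Y - X is a union of blocks of
    the dependency basis of X."""
    rest = frozenset(Y) - frozenset(X)
    blocks = [S for S in dependency_basis(X, mvds, U) if S <= rest]
    return frozenset().union(*blocks) == rest

U = {'A', 'B', 'C', 'D'}
M = [({'A'}, {'B'})]
print(sorted(sorted(S) for S in dependency_basis({'A'}, M, U)))  # → [['B'], ['C', 'D']]
print(mvd_implied({'A'}, {'A', 'C', 'D'}, M, U))  # → True (complementation)
```

Every MVD implied by M is read off from the basis blocks, which is why computing the basis is the fundamental step the abstract refers to.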

Relevance: 20.00%

Abstract:

A formulation of the quantum first-passage problem is attempted in terms of a restricted Feynman path integral that simulates an absorbing barrier, as in the corresponding classical case. The positivity of the resulting probability density, however, remains to be demonstrated.
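The abstract leaves the formulation implicit. By analogy with the classical first-passage construction it invokes, a restricted-path-integral formulation can be sketched as below; the symbols (barrier position a, restricted propagator K_r, restricted wavefunction ψ_r) are illustrative, not taken from the paper:

```latex
% Restricted propagator: sum only over paths that never cross the barrier at x = a
K_r(x_b, T; x_a, 0) = \int_{x(t) < a \;\forall t \in [0,T]} \mathcal{D}[x(t)]\; e^{\,i S[x]/\hbar}

% Survival probability and first-passage density, as in the classical analogy
P_{\mathrm{surv}}(T) = \int_{-\infty}^{a} \bigl|\psi_r(x,T)\bigr|^{2}\, dx ,
\qquad
F(T) = -\,\frac{dP_{\mathrm{surv}}(T)}{dT}
```

The open issue flagged in the abstract is precisely whether F(T) defined this way is non-negative, as a first-passage density must be.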

Relevance: 20.00%

Abstract:

A direct and simple approach, utilizing Watson's lemma, is presented for obtaining an approximate solution of a three-part Wiener-Hopf problem associated with the problem of diffraction of a plane wave by a soft strip.
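For reference, the standard statement of Watson's lemma that such asymptotic approximations rest on (this is the textbook form, not a formula from the paper):

```latex
% Watson's lemma: if f(t) ~ \sum_{n \ge 0} a_n t^{n+\alpha} as t -> 0^+ with \alpha > -1,
% and f is locally integrable with at most exponential growth, then as \lambda -> \infty
\int_{0}^{\infty} e^{-\lambda t} f(t)\, dt
\;\sim\; \sum_{n=0}^{\infty} \frac{a_n\, \Gamma(n+\alpha+1)}{\lambda^{\,n+\alpha+1}}
```

The lemma converts the small-t expansion of an integrand into the large-λ expansion of its Laplace transform, which is what makes a direct approximate solution of the Wiener-Hopf integrals tractable.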

Relevance: 20.00%

Abstract:

In this study I consider what kind of perspective on the mind-body problem is taken, and can be taken, by the philosophical position called non-reductive physicalism. Many positions fall under this label. The form of non-reductive physicalism I discuss is, in essential respects, the position taken by Donald Davidson (1917-2003) and Georg Henrik von Wright (1916-2003). I defend their positions and discuss the unrecognized similarities between their views. Non-reductive physicalism combines two theses: (a) everything that exists is physical; (b) mental phenomena cannot be reduced to states of the brain. This means that according to non-reductive physicalism the mental aspect of humans (be it a soul, mind, or spirit) is an irreducible part of the human condition. Davidson and von Wright likewise claim that, in some important sense, the mental aspect of a human being does not reduce to the physical aspect, that there is a gap between these aspects that cannot be closed. I claim that their arguments for this conclusion are convincing. I also argue that whereas von Wright and Davidson give interesting arguments for the irreducibility of the mental, their physicalism is unwarranted: these philosophers do not give good reasons for believing that reality is thoroughly physical. Notwithstanding the materialistic consensus in contemporary philosophy of mind, the ontology of mind is still uncharted territory where real breakthroughs are not to be expected until a radically new ontological position is developed. The third main claim of this work is that the problem of mental causation cannot be solved from the Davidsonian-von Wrightian perspective. The problem of mental causation is the problem of how mental phenomena such as beliefs can cause physical movements of the body. As I see it, the essential point of non-reductive physicalism, the irreducibility of the mental, and the problem of mental causation are closely related.
If mental phenomena do not reduce to causally effective states of the brain, then what justifies the belief that mental phenomena have causal powers? If mental causes do not reduce to physical causes, then how can we tell when, or whether, the mental causes in terms of which human actions are explained are actually effective? I argue that this question, how to decide when mental causes really are effective, is the real problem of mental causation. The motivation to explore and defend a non-reductive position stems from the belief that reductive physicalism leads to serious ethical problems. My claim is that Davidson's and von Wright's ultimate reason for defending a non-reductive view traces back to their belief that a reductive understanding of human nature would be a narrow and possibly harmful perspective. The final conclusion of my thesis is that von Wright's and Davidson's positions provide a starting point from which the current scientistic philosophy of mind can be critically explored further in the future.

Relevance: 20.00%

Abstract:

In this paper, we consider the bi-criteria single-machine scheduling problem of n jobs with a learning effect. The two objectives considered are the total completion time (TC) and the total absolute differences in completion times (TADC). The aim is to find a sequence that performs well with respect to both objectives. In an earlier study, a method for solving the bi-criteria transportation problem was presented. In this paper, we apply that methodology to our bi-criteria single-machine scheduling problem with a learning effect and obtain the set of optimal sequences. Numerical examples are presented to illustrate the applicability and ease of understanding.
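The two objectives can be evaluated directly for any sequence. The sketch below computes TC and TADC under a position-based learning effect (processing time p_j · r^a for the job in position r) and extracts the Pareto-optimal sequences by brute force; the exponent a = -0.3 and the job data are illustrative, and the paper's transportation-based method is not reproduced:

```python
from itertools import permutations

def objectives(seq, p, a=-0.3):
    """TC and TADC of a job sequence under a position-based learning
    effect: the job in position r takes p[j] * r**a time (a < 0)."""
    C, t = [], 0.0
    for r, j in enumerate(seq, start=1):
        t += p[j] * r ** a
        C.append(t)
    TC = sum(C)
    TADC = sum(abs(Ci - Cj) for i, Ci in enumerate(C) for Cj in C[i + 1:])
    return TC, TADC

def pareto_sequences(p, a=-0.3):
    """Brute-force enumeration of the Pareto-optimal sequences for
    (TC, TADC); only sensible for small n."""
    results = {s: objectives(s, p, a) for s in permutations(range(len(p)))}
    front = []
    for s, (tc, td) in results.items():
        dominated = any(tc2 <= tc and td2 <= td and (tc2, td2) != (tc, td)
                        for tc2, td2 in results.values())
        if not dominated:
            front.append(s)
    return front

# Illustrative 4-job instance.
p = [4.0, 2.0, 7.0, 1.0]
front = pareto_sequences(p)
print(len(front), "Pareto-optimal sequence(s)")
```

Since TC = Σ_r w_r · p_[r] with position weights w_r = (n - r + 1) r^a that decrease in r, the shortest-processing-time order still minimizes TC under this learning effect; TADC generally favors different sequences, which is what creates the bi-criteria trade-off.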

Relevance: 20.00%

Abstract:

Design embraces several disciplines dedicated to the production of artifacts and services. These disciplines are quite independent, and only recently has psychological interest focused on them. Nowadays, psychological theories of design, also called the design cognition literature, describe the design process from an information-processing viewpoint. These models co-exist with normative standards of how designs should be crafted, and in many places there are concrete discrepancies between the two, in a way that resembles the differences between actual and ideal decision-making. This study aimed to explore one possible difference, related to problem decomposition. Decomposition is a standard component of human problem-solving models and is also included in the normative models of design. The idea of decomposition is to focus on a single aspect of the problem at a time. Despite its significance, the nature of decomposition in conceptual design is poorly understood and has only been investigated in a preliminary way. This study addressed the status of decomposition in the conceptual design of products using protocol analysis. Previous empirical investigations have argued that there are implicit and explicit forms of decomposition, but have not provided a theoretical basis for the two. The current research therefore began by reviewing the problem-solving and design literature and then composing a cognitive model of the solution search in conceptual design. The result is a synthetic view which describes recognition and decomposition as the basic schemata for conceptual design. A psychological experiment was conducted to explore decomposition. In the test, sixteen (N=16) senior students of mechanical engineering created concepts for two alternative tasks. The concurrent think-aloud method and protocol analysis were used to study decomposition.
The results showed that despite the emphasis on decomposition in formal education, only a few designers (N=3) used decomposition explicitly and spontaneously in the presented tasks, although the designers in general applied a top-down control strategy. Instead, inferring from their use of structured strategies, the designers relied throughout on implicit decomposition. These results confirm the initial observations found in the literature, but they also suggest that decomposition should be investigated further. In the future, the benefits and possibilities of explicit decomposition should be considered, along with the cognitive mechanisms behind decomposition. The current results could then be reinterpreted.

Relevance: 20.00%

Abstract:

This report is the third volume in ILAB’s international child labor series. It focuses on the use of child labor in the production of apparel for the U.S. market, and reviews the extent to which U.S. apparel importers have established and are implementing codes of conduct or other business guidelines prohibiting the use of child labor in the production of the clothing they sell. The report was mandated by the Omnibus Consolidated Rescissions and Appropriations Act of 1996, P.L. 104-134.

Relevance: 20.00%

Abstract:

Premature or abnormal softening of persimmon fruit within 3-7 days after harvest is a major physiological problem of non-astringent persimmon cultivars grown in subtropical regions of Australia. Up to 30% of consignments may soften rapidly, frequently overnight, often resulting in flesh that becomes very soft, completely translucent, and impossible to handle. The incidence of premature soft fruit can vary with season and production location. To study the incidence of this problem, we conducted surveys of fruit harvested from five environmentally diverse regions of Australia over a two-year period. We found wide variation in the rates of both premature softening and normal softening, with differences of up to 37 days between orchards in the time taken for 50% of fruit to become soft. We found that the rate of fruit softening was exacerbated by lower calcium concentrations at fruit set, shorter fruit development periods, and heavier rainfall during the fruit development period. The implications of our findings for orchard management and for export and domestic marketing strategies are discussed.