909 results for External Constraint


Relevance: 20.00%

Abstract:

Radiation therapy has been used as an effective treatment for malignancies in pediatric patients. However, in many cases the side effects of radiation diminish these patients' quality of life. In order to develop strategies that minimize radiogenic complications, one must first quantitatively estimate pediatric patients' relative risk of radiogenic late effects, which had not been feasible until recently because of the computational complexity involved. The goals of this work were: to calculate the dose delivered to tissues and organs in pediatric patients during contemporary photon and proton radiotherapies; to estimate the corresponding risks of radiogenic second cancer and cardiac toxicity based on the calculated doses and on dose-risk models from the literature; to test the statistical significance of the difference between predicted risks after photon versus proton radiotherapy; and to provide a prototype of an evidence-based approach to selecting treatment modalities for pediatric patients that takes second cancer and cardiac toxicity into account. The results showed that proton therapy confers a lower predicted risk of radiogenic second cancer, and lower risks of radiogenic cardiac toxicity, than photon therapy. An uncertainty analysis revealed that the qualitative findings of this study are insensitive to changes in a wide variety of host- and treatment-related factors.
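
As a hedged illustration of the risk-projection step summarized above, the sketch below applies a linear dose-risk model to per-organ doses for two treatment plans. All organ doses and risk coefficients are invented placeholders, not values from the study.

```python
# Hypothetical organ doses (Gy) and linear risk coefficients (excess relative
# risk per Gy); all numbers are illustrative placeholders, not study values.
ORGAN_DOSE_GY = {
    "photon": {"thyroid": 8.0, "breast": 3.5, "lung": 5.2},
    "proton": {"thyroid": 2.1, "breast": 0.9, "lung": 1.4},
}
RISK_PER_GY = {"thyroid": 0.02, "breast": 0.01, "lung": 0.015}

def predicted_excess_risk(plan: str) -> float:
    """Sum a linear dose-risk model over organs for one treatment plan."""
    return sum(RISK_PER_GY[o] * d for o, d in ORGAN_DOSE_GY[plan].items())

for plan in ("photon", "proton"):
    print(f"{plan}: predicted excess relative risk = {predicted_excess_risk(plan):.3f}")
```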

Relevance: 20.00%

Abstract:

Standardization is a common method for adjusting for confounding factors when comparing two or more exposure categories to assess excess risk. The arbitrary choice of a standard population introduces selection bias due to the healthy worker effect, and small samples in specific groups make both the estimation of relative risk and its statistical significance problematic. As an alternative, statistical models have been proposed to overcome these limitations and produce adjusted rates. In this dissertation, a multiplicative model is considered to address the issues related to standardized indices, namely the Standardized Mortality Ratio (SMR) and the Comparative Mortality Factor (CMF). Maximum likelihood estimates of the model parameters are used to construct an index, similar to the SMR, for estimating the relative risk of the exposure groups under comparison. A parametric bootstrap resampling method is used to evaluate the goodness of fit of the model, the behavior of the estimated parameters, and the variability of the relative risk on generated samples. The model provides an alternative to both direct and indirect standardization.
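
A minimal sketch of the conventional standardized index and the parametric bootstrap described above, on hypothetical stratum data; the dissertation's multiplicative model itself is not reproduced, only the SMR it is compared against.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: observed deaths per age stratum, person-years at risk,
# and death rates of a chosen standard population.
observed = np.array([4, 12, 25])
person_years = np.array([1000.0, 800.0, 500.0])
standard_rate = np.array([0.002, 0.010, 0.030])   # deaths per person-year

expected = standard_rate * person_years           # expected deaths
smr = observed.sum() / expected.sum()             # standardized mortality ratio

# Parametric bootstrap: redraw stratum death counts as Poisson with the fitted
# means, recompute the index each time, and read off percentile limits.
boot = np.array([
    rng.poisson(smr * expected).sum() / expected.sum() for _ in range(10_000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"SMR = {smr:.2f}, 95% bootstrap interval [{lo:.2f}, {hi:.2f}]")
```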

Relevance: 20.00%

Abstract:

The research project is an extension of a series of administrative science and health care research projects evaluating the influence of external context, organizational strategy, and organizational structure on organizational success or performance. The research relies on the assumption that there is no single best approach to the management of organizations (contingency theory). Because organizational effectiveness depends on an appropriate mix of factors, organizations may be equally effective based on differing combinations of factors. The external context of the organization is expected to influence internal organizational strategy and structure, and these internal measures in turn affect performance (discriminant theory). The research considers the relationship between external context and organizational performance.

The unit of study is the health maintenance organization (HMO): an organization that, in exchange for a fixed, advance capitation payment, accepts contractual responsibility to assure the delivery of a stated range of health services to a voluntarily enrolled population. With the current Federal resurgence of interest in the HMO as a major component of the health care system, attention must be directed at maximizing the development of HMOs from the limited resources available. Increased skill is needed in both Federal and private evaluation of HMO feasibility, in order to prevent resource investment in projects that will fail while concurrently identifying potentially successful projects that would not be considered under current standards.

The research considers 192 factors measuring the contextual milieu (social, educational, economic, legal, demographic, health, and technological factors). Through intercorrelation and principal components data reduction techniques, these were reduced to 12 variables. Two measures of HMO performance were identified: (1) HMO status (operational or defunct), and (2) a principal components factor score combining eight measures of performance. The relationship between HMO context and performance was analyzed using correlation and stepwise multiple regression methods. In each case it was concluded that the external contextual variables are not predictive of the success or failure of the study HMOs, suggesting that HMO performance may depend on internal organizational factors. These findings have policy implications, as contextual measures are used as a major determinant in HMO feasibility analysis and as a factor in the allocation of limited Federal funds.
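
The following sketch mirrors the reported analysis pipeline (192 contextual measures reduced to 12 components, then regressed on a performance score) on synthetic data. The variable counts follow the abstract, but the data and the library (scikit-learn) are illustrative choices, not the study's tools.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic stand-in: 192 contextual measures on 60 hypothetical HMOs, and a
# composite performance factor score with no real dependence on the context.
n_hmos, n_measures = 60, 192
X = rng.normal(size=(n_hmos, n_measures))
performance = rng.normal(size=n_hmos)

# Mirror the study's data reduction: 192 measures -> 12 component scores.
components = PCA(n_components=12).fit_transform(X)

model = LinearRegression().fit(components, performance)
print(f"R^2 of context components vs performance: {model.score(components, performance):.2f}")
# On real data, a near-zero R^2 would echo the study's conclusion that
# external context does not predict HMO success or failure.
```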

Relevance: 20.00%

Abstract:

The main objective of this study was to determine the external validity of a clinical prediction rule developed by the European Multicenter Study on Human Spinal Cord Injury (EM-SCI) to predict ambulation outcomes 12 months after traumatic spinal cord injury. Data from the North American Clinical Trials Network (NACTN) registry, with approximately 500 SCI cases, were used for this validity study. The predictive accuracy of the EM-SCI prognostic model was evaluated using calibration and discrimination based on 231 NACTN cases. The area under the receiver operating characteristic (ROC) curve was 0.927 (95% CI 0.894 - 0.959) for the EM-SCI model when applied to the NACTN population. This is lower than the AUC of 0.956 (95% CI 0.936 - 0.976) reported for the EM-SCI population, but it suggests that the EM-SCI clinical prediction rule distinguished well between those patients in the NACTN population who achieved independent ambulation and those who did not. The calibration curve suggests that the higher the prediction score, the higher the probability of walking, with the best predictions for AIS D patients. In conclusion, the EM-SCI clinical prediction rule was determined to be generalizable to the adult NACTN SCI population.
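
A hedged sketch of the discrimination step described above: computing the area under the ROC curve with a bootstrap confidence interval. The scores and outcomes are synthetic stand-ins, not NACTN data, and the percentile-bootstrap CI is an assumption about the method.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

# Synthetic stand-ins: one prediction-rule score per patient and the observed
# outcome (1 = independent ambulation at 12 months), for 231 "cases".
score = rng.uniform(0, 100, size=231)
outcome = (score + rng.normal(0, 25, size=231) > 55).astype(int)

auc = roc_auc_score(outcome, score)

# Nonparametric bootstrap over patients for an approximate 95% CI on the AUC.
boot = []
for _ in range(2000):
    b = rng.integers(0, len(score), size=len(score))
    if outcome[b].min() == outcome[b].max():      # resample must contain both classes
        continue
    boot.append(roc_auc_score(outcome[b], score[b]))
print(f"AUC = {auc:.3f}, 95% CI [{np.percentile(boot, 2.5):.3f}, "
      f"{np.percentile(boot, 97.5):.3f}]")
```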

Relevance: 20.00%

Abstract:

Manuscript 1: "Conceptual Analysis: Externalizing Nursing Knowledge." We use concept analysis to establish that the report tools nurses prepare, carry, reference, amend, and use as temporary data repositories are examples of cognitive artifacts. This tool, integrally woven throughout the work and practice of nurses, is important to cognition and clinical decision-making. Establishing the tool as a cognitive artifact will support new dimensions of study. Such studies can characterize how this report tool supports the nurse's cognition, internal representation of knowledge and skills, and external representation of knowledge.

Manuscript 2: "Research Methods: Exploring Cognitive Work." The purpose of this paper is to describe a complex, cross-sectional, multi-method approach to the study of personal cognitive artifacts in the clinical environment. The complex data arrays present in these cognitive artifacts warrant the use of multiple methods of data collection. Use of a less robust research design may result in an incomplete understanding of the meaning, value, and content of personal cognitive artifacts in the clinical environment, and of their relationship to the cognitive work of the user.

Manuscript 3: "Making the Cognitive Work of Registered Nurses Visible." Purpose: Knowledge representations and structures are created and used by registered nurses to guide patient care. Understanding is limited regarding how these knowledge representations, or cognitive artifacts, contribute to working memory, prioritization, organization, cognition, and decision-making. The purpose of this study was to identify and characterize the role of a specific cognitive artifact, as knowledge representation and structure, in the cognitive work of the registered nurse. Methods: Data were collected using qualitative research methods, by shadowing and interviewing 25 registered nurses. Data analysis employed triangulation and iterative analytic processes. Results: Nurse cognitive artifacts support recall, data evaluation, decision-making, organization, and prioritization. These cognitive artifacts employ spatial, longitudinal, chronologic, visual, and personal cues to support the cognitive work of nurses. Conclusions: Nurse cognitive artifacts are an important adjunct to the cognitive work of nurses and directly support patient care. Nurses need to be able to configure their cognitive artifacts in ways that are meaningful and support their internal knowledge representations.

Relevance: 20.00%

Abstract:

Despite the popularity of the positron-emitting glucose analog (¹⁸F)-2-deoxy-2-fluoro-D-glucose (2FDG) for the noninvasive "metabolic imaging" of organs with positron emission tomography (PET), the physiological basis for the tracer has not been tested, and the potential of 2FDG for the rapid kinetic analysis of altered glucose metabolism in the intact heart has not been fully exploited. We therefore developed a quantitative method to characterize changes in myocardial glucose metabolism noninvasively and with high temporal resolution.

The first objective of this work was to provide direct evidence that the initial steps in the metabolism of 2FDG are the same as for glucose, and that 2FDG is retained by the tissue in proportion to the rate of glucose utilization. The second objective was to characterize the kinetic changes in myocardial glucose transport and phosphorylation in response to changes in work load, competing substrates, acute ischemia and reperfusion, and the addition of insulin. To assess changes in myocardial glucose metabolism, isolated working rat hearts were perfused with glucose and 2FDG. Tissue uptake of 2FDG and the input function were measured on-line by external detection. The steady-state rate of 2FDG phosphorylation was determined by graphical analysis of 2FDG time-activity curves.

The rate of 2FDG uptake was linear with time, and the tracer was retained in its phosphorylated form. Tissue accumulation of 2FDG decreased within seconds of a reduction in work load, in the presence of competing substrates, and during reperfusion after global ischemia. Thus, most interventions known to alter glucose metabolism induced rapid parallel changes in 2FDG uptake. By contrast, insulin caused a significant increase in 2FDG accumulation only in hearts from fasted animals perfused at a sub-physiological work load. The mechanism for this phenomenon is not known, but it may be related to the existence of two different glucose transporter systems and/or to glycogen metabolism in the myocardial cell.

It is concluded that (1) 2FDG traces glucose uptake and phosphorylation in the isolated working rat heart; and (2) early and transient kinetic changes in glucose metabolism can be monitored with high temporal resolution using 2FDG and a simple positron coincidence counting system. The new method has revealed transients of myocardial glucose metabolism that would have remained unnoticed with conventional methods. These transients are not only important for the interpretation of glucose metabolic PET scans, but also provide insights into the mechanisms of glucose transport and phosphorylation in heart muscle.
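
The abstract's "graphical analysis of 2FDG time-activity curves" is not specified further; a common choice for reading a steady-state uptake rate off such curves is Patlak-style graphical analysis, sketched below on synthetic curves as an assumption, not as the authors' exact method.

```python
import numpy as np

# Synthetic arterial input function cp(t) and tissue curve ct(t) following the
# irreversible two-compartment picture: ct = Ki * integral(cp) + V0 * cp.
t = np.linspace(0.1, 30.0, 120)                   # minutes
dt = t[1] - t[0]
cp = 100.0 * np.exp(-0.1 * t) + 5.0               # plasma activity (arbitrary units)
ki_true, v0 = 0.04, 0.3                           # uptake constant (1/min), volume term
ct = ki_true * np.cumsum(cp) * dt + v0 * cp

# Patlak transform: plotting ct/cp against integral(cp)/cp gives a line whose
# slope estimates the net phosphorylation (uptake) constant Ki.
x = np.cumsum(cp) * dt / cp
y = ct / cp
slope, intercept = np.polyfit(x[t > 10], y[t > 10], 1)
print(f"estimated Ki = {slope:.4f} per min (true value {ki_true})")
```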

Relevance: 20.00%

Abstract:

This paper explores the idea that fear of floating can be justified as an optimal discretionary monetary policy in a dollarized emerging economy. Specifically, I consider a small open economy in which intermediate-goods importers borrow in foreign currency and face a credit constraint. In this economy, exchange rate depreciation not only worsens importers' net worth but also increases the amount they must finance in domestic currency, thereby raising their borrowing finance premium. Moreover, because of the high exchange rate pass-through into import prices, fluctuations in the exchange rate also have strong effects on domestic prices and production. Together, these effects magnify the macroeconomic consequences of a floating exchange rate policy in response to external shocks. The paper shows that the floating exchange rate regime is dominated by the fixed exchange rate regime both in its ability to cushion shocks and in welfare terms.
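
A minimal notational sketch of the balance-sheet channel described above; the symbols and the functional form of the premium are illustrative assumptions, not the paper's model.

```latex
% Illustrative notation, not the paper's. An importer with home-currency net
% worth N_t borrows B*_t in foreign currency at gross rate R*_t to finance
% imported inputs of home-currency value P_t M_t; S_t is the nominal exchange
% rate (home currency per foreign unit):
\[
  \underbrace{S_t B^{*}_t}_{\text{foreign-currency debt, in home currency}}
    = P_t M_t - N_t,
  \qquad
  \text{repayment due} = S_{t+1} R^{*}_t B^{*}_t .
\]
% A depreciation (a rise in S_{t+1}) inflates the repayment and erodes net
% worth, so a borrowing premium that decreases in the net-worth ratio rises:
\[
  R^{b}_t \;=\; R^{*}_t \,
    \Phi\!\left( \frac{N_t}{S_t B^{*}_t} \right),
  \qquad \Phi' < 0 .
\]
```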

Relevance: 20.00%

Abstract:

The calculus of binary relations was created by De Morgan in 1860 and later developed to a great extent by Peirce and Schröder. Tarski, Givant, Freyd, and Scedrov showed that relation algebras can formalize first order logic, higher order logic, and set theory. Building on the mathematical results of Tarski and Freyd, this thesis develops denotational and operational semantics for constraint logic programming using relation algebra as the foundation. The central idea is the notion of executable semantics: semantics whose defining characteristic is that execution is possible using the standard reasoning of the semantic universe, in this case equational reasoning. This work shows that distributive relation algebras with a fixed-point operator capture all the standard theory and metatheory of constraint logic programming, including the trees used in proof search. Most program optimization techniques, partial evaluation, and abstract interpretation can be carried out using the semantics presented here, and proving the correctness of the implementation becomes extremely simple.

In the first part of the thesis, a constraint logic program is translated into a set of relational terms. The standard set-theoretic interpretation of these relations coincides with the standard semantics for CLP. Queries against the translated program are carried out by relation rewriting. The first part concludes with a proof of correctness and operational equivalence of this new semantics, and with the definition of a unification algorithm based on relation rewriting.

The second part of the thesis develops a semantics for constraint logic programming using Freyd's theory of allegories, the categorical version of the algebra of relations. To this end, two new notions are defined, Regular Lawvere Category and _-allegory, in which a logic program can be interpreted. The fundamental advantage of the categorical approach is the definition of a categorical machine that improves on the rewriting system presented in the first part. Thanks to the use of tabular relations, the machine models efficient execution without leaving a strictly formal framework. Using diagram rewriting, an algorithm is defined for computing pullbacks in Regular Lawvere Categories. The domains of the tabulations provide information about memory usage and free variables, while shared state is captured by the diagrams. The specification of the machine induces the formal derivation of an efficient instruction set. The categorical framework brings other important advantages, such as the possibility of incorporating algebraic data types, functions, and other extensions to Prolog, while preserving the 100% declarative character of our semantics.

ABSTRACT
The calculus of binary relations was introduced by De Morgan in 1860, to be greatly developed by Peirce and Schröder, as well as many others in the twentieth century. Using different formulations of relational structures, Tarski, Givant, Freyd, and Scedrov have shown how relation algebras can provide a variable-free way of formalizing first order logic, higher order logic and set theory, among other formal systems.
Building on those mathematical results, we develop denotational and operational semantics for Constraint Logic Programming using relation algebra. The idea of executable semantics plays a fundamental role in this work, both as a philosophical and as a technical foundation. We call a semantics executable when program execution can be carried out using the regular theory and tools that define the semantic universe. Throughout this work, pure algebraic reasoning is the basis of the denotational and operational results, eliminating all the classical non-equational metatheory associated with traditional semantics for Logic Programming. All reasoning, including execution, is performed equationally, to the point that we could state that the denotational semantics of a CLP program is directly executable. Techniques like optimization, partial evaluation, and abstract interpretation find a natural place in our algebraic models. Other properties, like correctness of the implementation or program transformation, are easy to check, as they are carried out using instances of the general equational theory. In the first part of the work, we translate Constraint Logic Programs to binary relations in a modified version of the distributive relation algebras used by Tarski. Execution is carried out by a rewriting system. We prove adequacy and operational equivalence of the semantics. In the second part of the work, the relation-algebraic approach is improved by using allegory theory, a categorical version of the algebra of relations developed by Freyd and Scedrov. The use of allegories lifts the semantics to typed relations, which capture, in a declarative way, the number of logical variables used by a predicate or program state. A logic program is interpreted in a _-allegory, which is in turn generated from a new notion of Regular Lawvere Category. As in the untyped case, program translation coincides with program interpretation. Thus, we develop a categorical machine directly from the semantics. The machine is based on relation composition, with a pullback-calculation algorithm at its core. The algorithm is defined with the help of a notion of diagram rewriting. In this operational interpretation, types represent information about memory allocation, and the execution mechanism is more efficient thanks to the faithful representation of shared state by categorical projections. We finish the work by illustrating how the categorical semantics allows the incorporation into Prolog of constructs typical of functional programming, like abstract data types and strict and lazy functions.
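
As a toy illustration of the algebraic reading described above, the sketch below models predicates as binary relations (sets of pairs), clause conjunction as relational composition, clause alternatives as union, and recursion as a least fixed point. It is a set-theoretic cartoon of the idea, not the thesis' rewriting system or categorical machine.

```python
# Predicates as finite binary relations; recursion via Kleene iteration.

def compose(r, s):
    """Relational composition r ; s."""
    return {(a, c) for (a, b1) in r for (b2, c) in s if b1 == b2}

def lfp(f):
    """Least fixed point of f by iteration from the empty relation."""
    current = set()
    while True:
        nxt = f(current)
        if nxt == current:
            return current
        current = nxt

# ancestor(X, Y) :- parent(X, Y).
# ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
parent = {("ann", "bob"), ("bob", "cid")}
ancestor = lfp(lambda r: parent | compose(parent, r))
print(sorted(ancestor))   # [('ann', 'bob'), ('ann', 'cid'), ('bob', 'cid')]
```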

Relevance: 20.00%

Abstract:

Metal grid lines are a vital element of multijunction solar cells, needed to extract the generated photocurrent from the cell. Nevertheless, they imply a certain shadowing factor, and thus a certain reflectivity at the cell surface, which lowers the cell's light absorption. This reflectivity produces a loss in electrical efficiency, and thus a loss in global energy production for CPV systems. We present here an optical design for recovering this portion of reflected light, leading to an increase in system efficiency. The new design is based on an external confinement cavity, an optical element able to redirect the light reflected by the cell back towards its surface. It has been made possible by the recent invention of the advanced Köhler concentrators by LPI, which can easily integrate one of these cavities. We have proven the excellent performance of these cavities integrated in this kind of CPV module, with outstanding results: 33.2% module electrical efficiency at Tcell = 25 °C, and relative efficiency and Isc gains of over 6%.
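
A hedged back-of-envelope check on the reported gain: if the cell (grid included) reflects a fraction R of the incident light and the cavity returns a fraction eta of each reflection to the cell, summing the bounce series boosts absorption by 1/(1 - eta*R). The values of R and eta below are illustrative guesses, chosen only to show that modest recycling lands in the reported range.

```python
# Bounce-series estimate: each pass, the cell absorbs (1 - R) of what reaches
# it and reflects R; the cavity returns a fraction eta of each reflection.
# Total absorption scales by sum((eta * R)**n) = 1 / (1 - eta * R).
R, eta = 0.08, 0.80                   # illustrative guesses, not measured values

relative_gain = eta * R / (1 - eta * R)
print(f"relative Isc gain ~ {100 * relative_gain:.1f}%")   # ~6.8%, in the reported range
```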

Relevance: 20.00%

Abstract:

Multijunction solar cells present a certain reflectivity at their surface that lowers their light absorption. This reflectivity produces a loss in electrical efficiency, and thus a loss in global energy production for CPV systems. We present here an optical design for recovering this portion of reflected light, leading to an increase in system efficiency. The new design is based on an external confinement cavity, an optical element able to redirect the light reflected by the cell back towards its surface. We have proven the excellent performance of these cavities integrated in CPV modules, with outstanding results: 33.2% module electrical efficiency at Tcell = 25 °C, and relative efficiency and Isc gains of over 6%.

Relevance: 20.00%

Abstract:

Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid 80s, there has been significant progress in the development of parallelizing compilers for logic programming (and, more recently, constraint programming), resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce, in a tutorial way, some of the problems faced by parallelizing compilers for logic and constraint programs, and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, and techniques for managing speculative and irregular computations through task granularity control and dynamic task allocation (such as work-stealing schedulers).
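
As a toy example of the independence-detection idea mentioned above: under strict independence, two goals may run in parallel when they share no logical variables, since neither can bind a variable the other reads. The sketch below is an illustrative simplification (variables modeled as capitalized strings), not the actual analysis these compilers perform.

```python
# Goals as nested tuples, e.g. ("append", "Xs", "Ys", "Zs"); identifiers that
# start with an uppercase letter stand for logical variables, as in Prolog.

def variables(term):
    """Collect the variables occurring in a nested tuple term."""
    if isinstance(term, str):
        return {term} if term[:1].isupper() else set()
    return set().union(*(variables(arg) for arg in term))

def strictly_independent(goal_a, goal_b):
    """Strict independence: the goals share no variables."""
    return not (variables(goal_a) & variables(goal_b))

g1 = ("append", "Xs", "Ys", "Zs")
g2 = ("length", "Ws", "N")
g3 = ("sum", "Zs", "S")
print(strictly_independent(g1, g2))   # True  -> safe to run in parallel
print(strictly_independent(g1, g3))   # False -> share Zs, must sequence
```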