892 results for Optimized allocation
Abstract:
BACKGROUND AND AIMS The Barcelona Clinic Liver Cancer (BCLC) staging system is the algorithm most widely used to manage patients with hepatocellular carcinoma (HCC). We aimed to investigate the extent to which the BCLC recommendations effectively guide clinical practice and to assess the reasons for any deviation from them. MATERIAL AND METHODS The first-line treatments assigned to patients included in the prospective Bern HCC cohort were analyzed. RESULTS Among 223 patients included in the cohort, 116 were not treated according to the BCLC algorithm. Eighty percent of the patients in BCLC stage 0 (very early HCC) and 60% of the patients in BCLC stage A (early HCC) received the recommended curative treatment. Only 29% of the BCLC stage B patients (intermediate HCC) and 33% of the BCLC stage C patients (advanced HCC) were treated according to the algorithm. Eighty-nine percent of the BCLC stage D patients (terminal HCC) were treated with best supportive care, as recommended. In 98 patients (44%), the performance status was disregarded in the stage assignment. CONCLUSION The management of HCC in clinical practice frequently deviates from the BCLC recommendations. Most of the curative therapy options, which have well-defined selection criteria, were allocated according to the recommendations, whereas the majority of the palliative therapy options were assigned to patients with tumor stages not aligned with the recommendations. The performance status, the only subjective parameter in the algorithm, is also the one least respected.
Abstract:
Pharmacokinetic and pharmacodynamic properties of a chiral drug can differ significantly between administration of the racemate and of the single enantiomers. During drug development, the characteristics of candidate compounds have to be assessed prior to clinical testing. Since biotransformation significantly influences drug action in an organism, metabolism studies represent a crucial part of such tests. Hence, an optimized and economical capillary electrophoretic method for on-line studies of enantioselective drug metabolism mediated by cytochrome P450 enzymes was developed. It comprises a diffusion-based procedure that enables mixing of the enzyme with virtually any compound inside the nanoliter-scale capillary reactor, without the need for additional optimization of mixing conditions. For CYP3A4, with ketamine as probe substrate and highly sulfated γ-cyclodextrin as chiral selector, improved separation conditions for the ketamine and norketamine enantiomers compared with a previously published electrophoretically mediated microanalysis method were elucidated. The new approach was thoroughly validated for the CYP3A4-mediated N-demethylation pathway of ketamine and applied to the determination of its kinetic parameters and its inhibition characteristics in the presence of ketoconazole and dexmedetomidine. The determined parameters were found to be comparable to literature data obtained with different techniques. The presented method constitutes a miniaturized and cost-effective tool that should be suitable for assessing the stereoselective aspects of kinetic and inhibition studies of cytochrome P450-mediated metabolic steps within the early stages of development of a new drug.
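Kinetic parameters of the kind reported above are typically obtained by fitting a Michaelis-Menten model to rate-versus-substrate data. A minimal sketch of such a fit is shown below; the concentrations, rates, and noise level are purely illustrative assumptions, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Michaelis-Menten model: v = Vmax * S / (Km + S)
def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

# Illustrative substrate concentrations and reaction rates
# (hypothetical units), not data from the study.
s = np.array([5, 10, 25, 50, 100, 250, 500], dtype=float)
true_vmax, true_km = 12.0, 40.0
rng = np.random.default_rng(0)
v = michaelis_menten(s, true_vmax, true_km) * rng.normal(1.0, 0.02, s.size)

# Nonlinear least-squares fit starting from a rough initial guess
(vmax_est, km_est), _ = curve_fit(michaelis_menten, s, v, p0=(10.0, 30.0))
print(f"Vmax ≈ {vmax_est:.1f}, Km ≈ {km_est:.1f}")
```

Inhibition studies such as the ketoconazole experiment would repeat this fit at several inhibitor concentrations and compare the apparent Km and Vmax values.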
Abstract:
This study analyzes the underlying factors and motives influencing the allocation of discretionary state expenditures. The fact that some cities receive more money than others raises the question of what accounts for this variation. After framing the provision of state money within the theoretical framework of political patronage, a case study of Governor Rowland’s tenure in office and the accompanying expenditures to Connecticut’s 17 largest cities from 1995 to 2004 was conducted to evaluate whether a disproportionate amount of money was given to Rowland’s hometown of Waterbury, Connecticut. In addition to a statistical analysis, which determined that cities with similar characteristics received different amounts of money, interviews were conducted to identify reasons for this variation. The results indicate that Waterbury received more money than predicted by the city’s economic and demographic characteristics, and that non-objective, biased factors such as favoritism, the need to reward political support, or the desire to increase political loyalty sometimes take precedence over more objective factors.
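The statistical analysis described above amounts to checking whether a city's actual allocation exceeds what a regression on its characteristics predicts. A minimal sketch of that residual check follows; the city names, covariates, and aid figures are hypothetical, not the study's Connecticut data:

```python
import numpy as np

# Hypothetical data: per-capita state aid vs. city characteristics
# (intercept, population in thousands, unemployment rate in %).
cities = ["A", "B", "C", "D", "Hometown"]
X = np.array([
    [1, 110, 5.2],
    [1,  95, 6.1],
    [1, 120, 4.8],
    [1, 100, 5.9],
    [1, 105, 5.5],   # similar characteristics to the others...
], dtype=float)
aid = np.array([50.0, 55.0, 48.0, 54.0, 75.0])  # ...but much more aid

# Fit OLS: predicted aid given observable characteristics
beta, *_ = np.linalg.lstsq(X, aid, rcond=None)
residuals = aid - X @ beta

# A large positive residual flags a city receiving more than its
# economic/demographic profile predicts.
flagged = cities[int(np.argmax(residuals))]
print(flagged, residuals.round(1))
```

A serious version would add more covariates, per-capita scaling, and standard errors, but the logic (predicted vs. actual allocation) is the same.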
Abstract:
Coccolithophores are unicellular phytoplankton that produce calcium carbonate coccoliths as an exoskeleton. Emiliania huxleyi, the most abundant coccolithophore in the world's ocean, plays a major role in the global carbon cycle by regulating the exchange of CO2 across the ocean-atmosphere interface through photosynthesis and calcium carbonate precipitation. As CO2 concentration is rising in the atmosphere, the ocean is acidifying and ammonium (NH4) concentration of future ocean water is expected to rise. The latter is attributed to increasing anthropogenic nitrogen (N) deposition, increasing rates of cyanobacterial N2 fixation due to warmer and more stratified oceans, and decreased rates of nitrification due to ocean acidification. Thus future global climate change will cause oceanic phytoplankton to experience changes in multiple environmental parameters including CO2, pH, temperature and nitrogen source. This study reports on the combined effect of elevated pCO2 and increased NH4 to nitrate (NO3) ratio (NH4/NO3) on E. huxleyi, maintained in continuous cultures for more than 200 generations under two pCO2 levels and two different N sources. Here we show that NH4 assimilation under N-replete conditions depresses calcification at both low and high pCO2, alters coccolith morphology, and increases primary production. We observed that N source and pCO2 synergistically drive growth rates, cell size and the ratio of inorganic to organic carbon. These responses to N source suggest that, compared to increasing CO2 alone, a greater disruption of the organic carbon pump could be expected in response to the combined effect of increased NH4/NO3 ratio and CO2 level in the future acidified ocean. Additional experiments conducted under lower nutrient conditions are needed prior to extrapolating our findings to the global oceans. 
Nonetheless, our results emphasize the need to assess combined effects of multiple environmental parameters on phytoplankton biology in order to develop accurate predictions of phytoplankton responses to ocean acidification.
Abstract:
Climate change, including ocean acidification (OA), presents fundamental challenges to marine biodiversity and sustained ecosystem health. We determined the reproductive response (measured as naupliar production), cuticle composition and stage-specific growth of the copepod Tisbe battagliai over three generations at four pH conditions (pH 7.67, 7.82, 7.95, and 8.06). Naupliar production increased significantly at pH 7.95 compared with pH 8.06, followed by a decline at pH 7.82. Naupliar production at pH 7.67 was higher than at pH 7.82. We attribute the increase at pH 7.95 to an initial stress response, which was succeeded by a hormesis-like response at pH 7.67. A multi-generational modelling approach predicted a gradual decline in naupliar production over the next 100 years (equivalent to approximately 2430 generations). There was a significant growth reduction (mean length integrated across developmental stages) relative to controls. There was a significant increase in the proportion of carbon relative to oxygen within the cuticle as seawater pH decreased. Changes in growth, cuticle composition and naupliar production strongly suggest that copepods subjected to OA-induced stress preferentially reallocate resources towards maintaining reproductive output at the expense of somatic growth and cuticle composition. These responses may drive shifts in life history strategies that favour smaller brood sizes, smaller females and perhaps later-maturing females, with the potential to profoundly destabilise marine trophodynamics.
Abstract:
Energy is required to maintain physiological homeostasis in response to environmental change. Although responses to environmental stressors frequently are assumed to involve high metabolic costs, the biochemical bases of actual energy demands are rarely quantified. We studied the impact of a near-future scenario of ocean acidification [800 µatm partial pressure of CO2 (pCO2)] during the development and growth of an important model organism in developmental and environmental biology, the sea urchin Strongylocentrotus purpuratus. Size, metabolic rate, biochemical content, and gene expression were not different in larvae growing under control and seawater acidification treatments. Measurements limited to those levels of biological analysis did not reveal the biochemical mechanisms of response to ocean acidification that occurred at the cellular level. In vivo rates of protein synthesis and ion transport increased 50% under acidification. Importantly, the in vivo physiological increases in ion transport were not predicted from total enzyme activity or gene expression. Under acidification, the increased rates of protein synthesis and ion transport that were sustained in growing larvae collectively accounted for the majority of available ATP (84%). In contrast, embryos and prefeeding and unfed larvae in control treatments allocated on average only 40% of ATP to these same two processes. Understanding the biochemical strategies for accommodating increases in metabolic energy demand and their biological limitations can serve as a quantitative basis for assessing sublethal effects of global change. Variation in the ability to allocate ATP differentially among essential functions may be a key basis of resilience to ocean acidification and other compounding environmental stressors.
Abstract:
Anthropogenic CO2 emissions will lead to an increase in seawater pCO2 of up to 80-100 Pa (800-1000 µatm) within this century and to an acidification of the oceans. Green sea urchins (Strongylocentrotus droebachiensis) occurring in the Kattegat already experience seasonal hypercapnic and hypoxic conditions today. Anthropogenic CO2 emissions will thus add to existing values and lead to even higher pCO2 values of >200 Pa (>2000 µatm). To estimate the green sea urchins' potential to acclimate to acidified seawater, we calculated an energy budget and determined the extracellular acid-base status of adult S. droebachiensis exposed to moderately (102 to 145 Pa, 1007 to 1431 µatm) and highly (284 to 385 Pa, 2800 to 3800 µatm) elevated seawater pCO2 for 10 and 45 days. A 45-day exposure to elevated pCO2 resulted in a shift in energy budgets, leading to reduced somatic and reproductive growth. Metabolic rates were not significantly affected, but ammonium excretion increased in response to elevated pCO2. This led to decreased O:N ratios. These findings suggest that protein metabolism is possibly enhanced under elevated pCO2 in order to support ion homeostasis by increasing net acid extrusion. The perivisceral coelomic fluid acid-base status revealed that S. droebachiensis is able to fully (intermediate pCO2) or partially (high pCO2) compensate extracellular pH (pHe) changes by accumulating bicarbonate (maximum increase 2.5 mM), albeit at a slower rate than typically observed in other taxa (10 days for full pHe compensation). At intermediate pCO2, sea urchins were able to maintain fully compensated pHe for 45 days.
Sea urchins from the higher pCO2 treatment could be divided into two groups following medium-term acclimation: one group of experimental animals (29%) contained remnants of food in their digestive system and maintained partially compensated pHe (+2.3 mM HCO3), while the other group (71%) exhibited an empty digestive system and a severe metabolic acidosis (-0.5 pH units, -2.4 mM HCO3). There was no difference in mortality between the three pCO2 treatments. The results of this study suggest that S. droebachiensis occurring in the Kattegat might be pre-adapted to hypercapnia due to natural variability in pCO2 in its habitat. We show for the first time that some echinoderm species can actively compensate extracellular pH. Seawater pCO2 values of >200 Pa, which will occur in the Kattegat within this century during seasonal hypoxic events, can possibly only be endured for a short time period of a few weeks. Increases in anthropogenic CO2 emissions and leakages from potential sub-seabed CO2 storage (CCS) sites thus impose a threat to the ecologically and economically important species S. droebachiensis.
Abstract:
Using a unique dataset from rural Andhra Pradesh, India, which contains direct observations of household access to credit and detailed time use, this study finds that credit market failures lead to a substantial reallocation of children's time across activities such as schooling, household chores, remunerative work, and leisure. The negative effect of credit constraints on schooling amounts to a 60% decrease in average schooling time. However, the magnitude of this decrease is about half that of the increase in both domestic and remunerative child labor, the other half appearing to come from a reduction in leisure.
Abstract:
This paper develops a quantitative measure of allocation efficiency that extends the dynamic Olley-Pakes productivity decomposition proposed by Melitz and Polanec (2015). The extended measure simultaneously captures the degree of misallocation within a group and between groups, while also capturing the contribution of entering and exiting firms to aggregate productivity growth. The measure is applied to assess the degree of misallocation in China using manufacturing firm-level data from 2004 to 2007. Misallocation among industrial sectors is found to increase over time, and allocation efficiency within an industry worsens in industries that use more capital and have firms with relatively higher state-owned market shares. Allocation efficiency among three ownership sectors (state-owned, domestic private, and foreign) tends to improve in industries where market share moves from the less productive state-owned sector to the more productive private sector.
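For reference, the static Olley-Pakes decomposition splits aggregate (share-weighted) productivity into an unweighted mean and a covariance term, and the Melitz-Polanec dynamic version attributes aggregate productivity growth between periods 1 and 2 to survivors (S), entrants (E) and exiters (X). The formulas below follow the usual convention (s for market shares, φ for firm-level productivity); the paper's extension adds a within/between-group split on top of this:

```latex
% Static OP decomposition at time t
\Phi_t = \bar{\varphi}_t + \sum_i (s_{it} - \bar{s}_t)(\varphi_{it} - \bar{\varphi}_t)
       = \bar{\varphi}_t + \operatorname{cov}_t

% Dynamic OP decomposition of aggregate productivity growth
\Delta\Phi = \Delta\bar{\varphi}_S + \Delta\operatorname{cov}_S
           + s_{E2}\,(\Phi_{E2} - \Phi_{S2})
           + s_{X1}\,(\Phi_{S1} - \Phi_{X1})
```

The last two terms are positive when entrants outperform surviving incumbents and when exiters underperformed them, respectively.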
Abstract:
Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them, and greater reliance will be placed on the delivery of their services. A relevant subset of them is high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices; many of these networks can transfer data at high speed. The concept of distributed systems emerged to describe systems whose parts execute on several nodes that interact with each other via a communication network. Java’s popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This motivated the development of the RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, the RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification is being developed under the Java Community Process (JSR-302); its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither the RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed.
The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware suitable for the development of distributed hard real-time systems in Java, based on the integration of the RMI (Remote Method Invocation) model with the HRTJ profile. It has been designed and implemented with the main requirements in mind: predictability and reliability of timing behavior and resource usage. The design starts with the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. It also includes mechanisms to monitor functional and timing behavior. It provides independence from the network protocol by defining a network interface and modules. The JRMP protocol was modified to include the two phases, non-functional parameters, and message-size optimizations. Although serialization is one of the fundamental operations for ensuring proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model.
The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out, with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (real consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.
Abstract:
The introduction of a low-temperature (LT) tail after P emitter diffusion has been shown by several research groups to lead to considerable improvements in electron lifetime and solar cell performance. So far, the drawbacks of the investigated extended gettering treatments have been the lack of knowledge about optimum annealing times and temperatures and the substantial increase in processing time. In this manuscript, we calculate optimum annealing temperatures of Fe-contaminated Si wafers for different annealing durations. Subsequently, it is shown theoretically and experimentally that a relatively short LT tail of 15 min can lead to a significant reduction of interstitial Fe and an increase in electron lifetime. Finally, we calculate the potential improvement in solar cell efficiency when such a short-tail extended P diffusion gettering is included in an industrial fabrication process.
Abstract:
Nonparametric belief propagation (NBP) is a well-known particle-based method for distributed inference in wireless networks. NBP has a large number of applications, including cooperative localization. However, in loopy networks NBP suffers from the same problems as standard BP, such as over-confident beliefs and possible nonconvergence. Tree-reweighted NBP (TRW-NBP) can mitigate these problems, but does not easily lead to a distributed implementation due to the non-local nature of the required so-called edge appearance probabilities. In this paper, we propose a variation of TRW-NBP suitable for cooperative localization in wireless networks. Our algorithm uses a fixed edge appearance probability for every edge, and can outperform standard NBP in dense wireless networks.
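The fixed-edge-appearance-probability idea can be illustrated on a toy discrete model. The sketch below runs tree-reweighted message passing on a three-node cycle with the same ρ on every edge (ρ = 1 recovers standard BP); the potentials are invented for illustration, and the paper's particle-based (nonparametric) machinery is omitted:

```python
import numpy as np

# Tree-reweighted BP on a 3-node cycle with binary states.
# Every edge uses the same fixed edge appearance probability RHO;
# for a 3-cycle, the uniform spanning-tree distribution gives RHO = 2/3.
RHO = 2.0 / 3.0

edges = [(0, 1), (1, 2), (0, 2)]
psi_node = [np.array([0.6, 0.4]), np.array([0.5, 0.5]), np.array([0.3, 0.7])]
psi_edge = {e: np.array([[1.2, 0.8], [0.8, 1.2]]) for e in edges}  # weakly attractive
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

msgs = {(u, v): np.ones(2) for u, v in edges}
msgs.update({(v, u): np.ones(2) for u, v in edges})

def edge_pot(u, v):
    # Return the pairwise potential indexed as [x_u, x_v].
    if (u, v) in psi_edge:
        return psi_edge[(u, v)]
    return psi_edge[(v, u)].T

for _ in range(50):
    new = {}
    for (u, v) in msgs:
        # TRW update: incoming messages reweighted by RHO, reverse
        # message divided out with exponent (1 - RHO).
        prod = psi_node[u].copy()
        for w in neighbors[u]:
            if w != v:
                prod *= msgs[(w, u)] ** RHO
        prod /= msgs[(v, u)] ** (1.0 - RHO)
        m = (edge_pot(u, v) ** (1.0 / RHO)).T @ prod
        new[(u, v)] = m / m.sum()
    msgs = new

# Beliefs: node potential times RHO-weighted incoming messages
beliefs = []
for u in range(3):
    b = psi_node[u].copy()
    for w in neighbors[u]:
        b *= msgs[(w, u)] ** RHO
    beliefs.append(b / b.sum())
print(np.round(beliefs, 3))
```

In NBP the discrete messages become particle sets, but the reweighting structure is the same; using one fixed ρ everywhere is what removes the non-local edge-appearance computation and makes a distributed implementation straightforward.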
Abstract:
This work proposes a method for determining the individual tolerance values of the parts that make up an assembly, starting from the tolerance values specified for the final assembly, while optimizing the total manufacturing cost of the individual parts using cost-tolerance functions based on each part's manufacturing process. Building on the main prior work on tolerance allocation, a working model is proposed, based on cost optimization through the application of the method of Lagrange multipliers to various cost-tolerance curves.
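The Lagrange-multiplier approach has a closed form in the simplest case. The sketch below assumes a worst-case tolerance stack (part tolerances sum to the assembly tolerance) and a reciprocal cost-tolerance curve c_i(t) = a_i + b_i/t for every part; the coefficients and assembly tolerance are illustrative assumptions, not values from the paper:

```python
import math

# Stationarity of sum_j c_j(t_j) + lam * (sum_j t_j - T_ASM) gives
# -b_i / t_i**2 + lam = 0, i.e. t_i = sqrt(b_i / lam); substituting
# into the constraint yields t_i = T_ASM * sqrt(b_i) / sum_j sqrt(b_j).

T_ASM = 0.30            # assembly tolerance, mm (illustrative)
b = [0.02, 0.08, 0.18]  # cost-curve coefficients per part (illustrative)

s = sum(math.sqrt(bi) for bi in b)
t = [T_ASM * math.sqrt(bi) / s for bi in b]
print([round(ti, 4) for ti in t], round(sum(t), 4))  # → [0.05, 0.1, 0.15] 0.3
```

Parts with steeper cost curves (larger b_i) receive looser tolerances, and at the optimum the marginal cost b_i/t_i² is the same for every part, which is exactly the Lagrange condition. Other cost models or a root-sum-square stack change the algebra but not the approach.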