931 results for elliptic wedge generators


Relevance:

10.00%

Publisher:

Abstract:

The research project starts from the dynamics of the outsourced distribution model of a mass-consumption company in Colombia specializing in dairy products, which for this study has been called "Lactosa". Using panel data with a case study, two demand models are built by product category and distributor, and through stochastic simulation the relevant variables affecting their cost structures are identified. The problem is modelled from the income statement of each of the four distributors analysed in the central region of the country. The cost structure and sales behaviour are analysed given a logistics distribution margin (%), as a function of the relevant independent variables relating to the business, the market, and the macroeconomic environment described in the object of study. Among other findings, notable gaps stand out in distribution costs and sales-force costs despite the homogeneity of segments. The study identifies the value and cost drivers with the greatest individual dispersion and suggests strategic alliances among some groups of distributors. The panel-data modelling identifies the relevant management variables that affect sales volume by category and distributor, focusing management's efforts. It is recommended to narrow these gaps and to promote, from the producer's side, strategies aimed at standardizing the distributors' internal processes, and to promote and replicate the analytical models without intending to replace expert knowledge. Scenario building jointly and securely strengthens the competitive position of the company and its distributors.
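
A rough illustration of the kind of panel-data model described above is sketched below; the variable names (sales_volume, price, distribution_margin, gdp_growth), the input file, and the specification are hypothetical and are not the study's actual model.

    # Minimal sketch of a fixed-effects panel regression of sales volume by
    # category and distributor. All variable names and the input file are
    # hypothetical; the study's actual specification is not reproduced here.
    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("distributor_panel.csv")  # hypothetical panel data set

    # Distributor and category fixed effects via dummy variables, with
    # standard errors clustered by distributor.
    model = smf.ols(
        "sales_volume ~ price + distribution_margin + gdp_growth"
        " + C(distributor) + C(category)",
        data=panel,
    ).fit(cov_type="cluster", cov_kwds={"groups": panel["distributor"]})

    print(model.summary())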

Relevance:

10.00%

Publisher:

Abstract:

Organ transplantation is considered one of the most significant advances of modern medicine and is an increasingly successful procedure in terms of patient survival, currently being the best treatment option for patients with countless pathologies. The donation process is insufficient to cover the population's transplant needs, so new strategies must be developed to strengthen the experience and effectiveness of existing programmes. Health professionals' lack of knowledge, and their perceptions of and attitudes towards issues related to the donation process, can turn them into facilitators of, or barriers to, the identification of potential donors. For this reason, the available resources, attitudes towards donation, legislation, and knowledge of the processes involved in tissue and organ donation are critical. Given the influence of health professionals, the objectives of this thesis project are: to determine the knowledge and skills of the health professionals responsible for organ and tissue transplantation in Regional 1, assessed by means of an educational tool, in order to help improve an efficient organ and tissue donation programme, and likewise to set out recommendations aimed at increasing donation rates, with special emphasis on hospital activity in the country. METHODOLOGY: A study was carried out based on the analysis of the assessment of knowledge of the organ and tissue donation-transplantation process among the health personnel participating in the educational tool called "Curso taller primer respondiente del potencial donante de órganos y tejidos" (first-responder workshop for the potential organ and tissue donor). The course included an evaluation form completed anonymously by the participants before and after receiving the course content. The study was carried out among health personnel of health-care institutions (IPS) belonging to Regional I of the Red Nacional de donación y trasplantes de órganos y tejidos (National Network for Organ and Tissue Donation and Transplantation). To determine whether participants' knowledge differed before and after attending the course, McNemar's test was used (p < 0.05). RESULTS: Between July 2011 and June 2012 the workshop was held and 303 respondents were obtained, including physicians, nurses, and nursing assistants. At the start of the course, correct answers on legislation, donor selection, brain death, and donor maintenance were around 50%. It was not possible to identify a profession that might pose a risk to donor detection and the associated processes. After the course, 72% of the questions were answered correctly, a statistically significant increase (McNemar's test, p = 0.00). DISCUSSION: The health personnel participating in the workshop, coming from units involved in generating donors, show a knowledge deficit regarding the donation-transplantation process, which may turn them into a limiting factor for that process.
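
For reference, the paired pre/post comparison with McNemar's test described above can be reproduced with standard statistical software; the sketch below uses made-up counts, not the study's data.

    # Minimal sketch of McNemar's test on paired pre/post answers to the same
    # question. The 2x2 table below is illustrative, not the study's data.
    from statsmodels.stats.contingency_tables import mcnemar

    #        post correct, post incorrect
    table = [[120, 15],    # pre correct
             [95, 73]]     # pre incorrect

    result = mcnemar(table, exact=False, correction=True)
    print(f"chi2 = {result.statistic:.2f}, p = {result.pvalue:.4f}")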

Relevance:

10.00%

Publisher:

Abstract:

Introduction. This study seeks to identify the occupational psychosocial factors associated with worker well-being in research carried out in Colombia and Spain during the period 2002-2012. Objective. The aim of this work is to characterize the research on psychosocial factors and their relationship with worker well-being in Colombia and Spain during 2002-2012, through the studies found on psychosocial factors and their beneficial impact on worker well-being, the legal framework of both countries, and the documentary review, consolidation, and subsequent analysis of the literature around the state of the art on occupational psychosocial factors. Method. This is a documentary study, carried out by means of a literature review in databases and the subsequent selection, classification, consolidation, systematization, and analysis of the research studies found that examined aspects related to psychosocial factors and their relationship with worker well-being in Colombia and Spain during 2002-2012. Results. The documentary review showed that studies on psychosocial factors and their relationship with worker well-being represent an important and ongoing challenge for organizations. Spain's progress on this relationship also stands out, whereas in Colombia studies are still directed towards risk or harmful factors rather than towards protective or well-being factors that generate a beneficial effect on workers and hence on the organization.

Relevance:

10.00%

Publisher:

Abstract:

Seeking to preserve the environment and to standardize the final disposal of the waste generated, Sustain Cycle was created as a manager of used vegetable cooking oil, strengthening the market's scarce supply of such services. Besides reinforcing the commitment of the generators of this waste and reducing their costs, Sustain Cycle is set up under the legal form of a foundation, so that taxpayers can deduct the value of the donations made. Its environmental contribution lies in minimizing the risk of improper disposal into the sewer network, and its social component focuses on preventing illegal reuse that is harmful to consumers. Sustain Cycle focuses on the collection, storage, filtering, and sale of the used vegetable oil (AVU) generated by all commercial establishments that produce foods such as French fries, empanadas, buñuelos, churros, plantains, and other products that require oil for cooking.

Relevance:

10.00%

Publisher:

Abstract:

The performance of a model-based diagnosis system can be affected by several sources of uncertainty, such as model errors, uncertainty in measurements, and disturbances. This uncertainty can be handled by means of interval models. The aim of this thesis is to propose a methodology for fault detection, isolation, and identification based on interval models. The methodology includes algorithms to obtain, in an automatic way, the symbolic expressions of the residual generators that enhance the structural isolability of the faults, in order to design the fault detection tests. These algorithms are based on the structural model of the system. The stages of fault detection, isolation, and identification are stated as constraint satisfaction problems in continuous domains and solved by means of interval-based consistency techniques. The qualitative fault isolation is enhanced by a reasoning in which the signs of the symptoms are derived from analytical redundancy relations or bond graph models of the system. An initial, empirical analysis of the differences between interval-based and statistical techniques is also presented in this thesis. The performance and efficiency of the contributions are illustrated through several application examples covering different levels of complexity.
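
A minimal sketch of the interval-based consistency test that underlies this kind of fault detection is given below, using an illustrative first-order model and made-up parameter bounds rather than any of the thesis case studies.

    # Minimal sketch of an interval-based fault detection test: a measurement
    # is consistent (no fault detected) if it lies in the interval predicted
    # by the uncertain model. Model, bounds, and data are illustrative only.
    def residual_interval(y_prev, u_prev, a_bounds, b_bounds):
        """Interval of y[k] predicted by y[k] = a*y[k-1] + b*u[k-1],
        with a in a_bounds and b in b_bounds (interval parameters)."""
        candidates = [a * y_prev + b * u_prev
                      for a in a_bounds for b in b_bounds]
        return min(candidates), max(candidates)

    # Illustrative data and parameter intervals.
    y = [1.0, 0.93, 0.88, 1.40]          # measurements (last one anomalous)
    u = [0.5, 0.5, 0.5, 0.5]             # known input
    a_bounds, b_bounds = (0.80, 0.90), (0.10, 0.20)

    for k in range(1, len(y)):
        lo, hi = residual_interval(y[k - 1], u[k - 1], a_bounds, b_bounds)
        fault = not (lo <= y[k] <= hi)
        print(f"k={k}: predicted [{lo:.3f}, {hi:.3f}], measured {y[k]:.3f}, "
              f"fault detected: {fault}")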

Relevance:

10.00%

Publisher:

Abstract:

We consider a class of boundary integral equations that arise in the study of strongly elliptic BVPs in unbounded domains of the form $D = \{(x, z)\in \mathbb{R}^{n+1} : x\in \mathbb{R}^n, z > f(x)\}$ where $f : \mathbb{R}^n \to\mathbb{R}$ is a sufficiently smooth bounded and continuous function. A number of specific problems of this type, for example acoustic scattering problems, problems involving elastic waves, and problems in potential theory, have been reformulated as second kind integral equations $u+Ku = v$ in the space $BC$ of bounded, continuous functions. Having recourse to the so-called limit operator method, we address two questions for the operator $A = I + K$ under consideration, with an emphasis on the function space setting $BC$. Firstly, under which conditions is $A$ a Fredholm operator, and, secondly, when is the finite section method applicable to $A$?
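
To illustrate the finite section method in the simplest possible setting, the sketch below truncates a second-kind equation u + Ku = v to an interval [-A, A] and solves the discretised system for increasing A; the kernel and right-hand side are illustrative and are not the boundary integral operators studied in the paper.

    # Minimal sketch of the finite section method for a second-kind equation
    # u + Ku = v, with an illustrative convolution kernel on the real line.
    # The equation is truncated to [-A, A] and discretised (Nystrom/midpoint).
    import numpy as np

    def solve_finite_section(A, n, v, kernel):
        h = 2 * A / n
        x = -A + h * (np.arange(n) + 0.5)          # midpoint nodes
        K = kernel(x[:, None] - x[None, :]) * h    # Nystrom matrix for K
        u = np.linalg.solve(np.eye(n) + K, v(x))   # (I + K_A) u = v
        return x, u

    kernel = lambda t: 0.25 * np.exp(-np.abs(t))   # illustrative kernel, ||K|| < 1
    v = lambda x: np.exp(-x**2)                    # illustrative right-hand side

    # Increasing the truncation parameter A checks convergence of the sections.
    for A in (5.0, 10.0, 20.0):
        x, u = solve_finite_section(A, n=40 * int(A), v=v, kernel=kernel)
        print(f"A={A:5.1f}: u(0) ~ {u[np.argmin(np.abs(x))]:.6f}")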

Relevance:

10.00%

Publisher:

Abstract:

These notes have been issued on a small scale in 1983 and 1987 and on request at other times. This issue follows two items of news. First, Walter Colquitt and Luther Welsh found the 'missed' Mersenne prime M110503 and advanced the frontier of complete Mp-testing to 139,267. In so doing, they terminated Slowinski's significant string of four consecutive Mersenne primes. Secondly, a team of five established a non-Mersenne number as the largest known prime. This result terminated the 1952-89 reign of Mersenne primes. All the original Mersenne numbers with p < 258 were factorised some time ago. The Sandia Laboratories team of Davis, Holdridge & Simmons, with some little assistance from a CRAY machine, cracked M211 in 1983 and M251 in 1984. They contributed their results to the 'Cunningham Project', care of Sam Wagstaff. That project is now moving apace thanks to developments in technology, factorisation and primality testing. New levels of computer power and new computer architectures motivated by the open-ended promise of parallelism are now available. Once again, the suppliers may be offering free buildings with the computer. However, the Sandia '84 CRAY-1 implementation of the quadratic-sieve method is now outpowered by the number-field sieve technique. This is deployed on either purpose-built hardware or large syndicates, even distributed world-wide, of collaborating standard processors. New factorisation techniques of both special and general applicability have been defined and deployed. The elliptic-curve method finds large factors with helpful properties while the number-field sieve approach is breaking down composites with over one hundred digits. The material is updated on an occasional basis to follow the latest developments in primality-testing large Mp and factorising smaller Mp; all dates derive from the published literature or referenced private communications. Minor corrections, additions and changes merely advance the issue number after the decimal point. The reader is invited to report any errors and omissions that have escaped the proof-reading, to answer the unresolved questions noted and to suggest additional material associated with this subject.
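
For context, the complete Mp-testing referred to above is done with the Lucas-Lehmer test; the sketch below is fine for small exponents but far too slow for record-size exponents such as 110503 without FFT-based multiplication.

    # Minimal sketch of the Lucas-Lehmer test for Mersenne numbers Mp = 2^p - 1
    # (p an odd prime). Record-scale testing needs FFT-based arithmetic.
    def lucas_lehmer(p):
        m = (1 << p) - 1          # Mp = 2^p - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0             # Mp is prime iff s_{p-2} == 0 (mod Mp)

    # Small checks: M7 = 127 is prime, M11 = 2047 = 23 * 89 is not.
    for p in (7, 11, 13, 521):
        print(f"M{p} is {'prime' if lucas_lehmer(p) else 'composite'}")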

Relevance:

10.00%

Publisher:

Abstract:

Mostly because of a lack of observations, fundamental aspects of the St. Lawrence Estuary's wintertime response to forcing remain poorly understood. The results of a field campaign over the winter of 2002/03 in the estuary are presented. The response of the system to tidal forcing is assessed through the use of harmonic analyses of temperature, salinity, sea level, and current observations. The analyses confirm previous evidence for the presence of semidiurnal internal tides, albeit at greater depths than previously observed for ice-free months. The low-frequency tidal streams were found to be mostly baroclinic in character and to produce an important neap tide intensification of the estuarine circulation. Despite stronger atmospheric momentum forcing in winter, the response is found to be less coherent with the winds than seen in previous studies of ice-free months. The tidal residuals show the cold intermediate layer in the estuary is renewed rapidly (about 14 days) in late March by the advection of a wedge of near-freezing waters from the Gulf of St. Lawrence. In situ processes appeared to play a lesser role in the renewal of this layer. In particular, significant wintertime deepening of the estuarine surface mixed layer was prevented by surface stability, which remained high throughout the winter. The observations also suggest that the bottom circulation was intensified during winter, with the intrusion in the deep layer of relatively warm Atlantic waters, such that the 3 °C isotherm rose from below 150 m to near 60 m.
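
The harmonic analyses mentioned above amount to least-squares fits of sinusoids at known tidal frequencies; the sketch below fits the M2 and S2 constituents to a synthetic hourly record rather than to the field data.

    # Minimal sketch of tidal harmonic analysis: least-squares fit of sinusoids
    # at known constituent frequencies (here M2 and S2) to an hourly record.
    # The record below is synthetic; a real analysis would use observed series.
    import numpy as np

    t = np.arange(0, 30 * 24, 1.0)                       # 30 days, hourly (hours)
    freqs = {"M2": 1 / 12.4206012, "S2": 1 / 12.0}       # cycles per hour

    # Synthetic sea level: 1.5 m M2, 0.4 m S2, plus noise.
    rng = np.random.default_rng(0)
    eta = (1.5 * np.cos(2 * np.pi * freqs["M2"] * t + 0.3)
           + 0.4 * np.cos(2 * np.pi * freqs["S2"] * t - 1.0)
           + 0.1 * rng.standard_normal(t.size))

    # Design matrix: mean plus one cos/sin pair per constituent.
    cols = [np.ones_like(t)]
    for f in freqs.values():
        cols += [np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), eta, rcond=None)

    for i, name in enumerate(freqs):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        print(f"{name}: amplitude ~ {np.hypot(a, b):.3f} m")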

Relevance:

10.00%

Publisher:

Abstract:

Accuracy and mesh generation are key issues for the high-resolution hydrodynamic modelling of the whole Great Barrier Reef. Our objective is to generate suitable unstructured grids that can resolve topological and dynamical features like tidal jets and recirculation eddies in the wake of islands. A new strategy is suggested to refine the mesh in areas of interest taking into account the bathymetric field and an approximated distance to islands and reefs. Such a distance is obtained by solving an elliptic differential operator, with specific boundary conditions. Meshes produced illustrate both the validity and the efficiency of the adaptive strategy. Selection of refinement and geometrical parameters is discussed.
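
One simple way to obtain such a smooth, distance-like field by solving an elliptic problem is sketched below (Laplace's equation with the value pinned to 0 on island cells and 1 on the outer boundary); this is a generic illustration, not the specific operator or boundary conditions used by the authors.

    # Minimal sketch: a distance-like field to obstacles from an elliptic solve
    # on a grid. u = 0 on island cells, u = 1 on the outer boundary, harmonic
    # in between; u then grows smoothly away from the island and can feed a
    # mesh-size field that refines the grid near islands and reefs.
    import numpy as np

    n = 61
    u = np.ones((n, n))                        # outer boundary held at 1
    island = np.zeros((n, n), dtype=bool)
    island[24:37, 27:34] = True                # illustrative rectangular island
    u[island] = 0.0

    # Jacobi relaxation of Laplace's equation in the interior.
    for _ in range(5000):
        new = u.copy()
        new[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1]
                                  + u[1:-1, 2:] + u[1:-1, :-2])
        new[island] = 0.0                      # re-impose island condition
        new[0, :] = new[-1, :] = new[:, 0] = new[:, -1] = 1.0
        u = new

    print(f"next to island: {u[23, 30]:.3f}   near outer boundary: {u[1, 30]:.3f}")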

Relevance:

10.00%

Publisher:

Abstract:

We study certain boundary value problems for the one-dimensional wave equation posed in a time-dependent domain. The approach we propose is based on a general transform method for solving boundary value problems for integrable nonlinear PDE in two variables, that has been applied extensively to the study of linear parabolic and elliptic equations. Here we analyse the wave equation as a simple illustrative example to discuss the particular features of this method in the context of linear hyperbolic PDEs, which have not been studied before in this framework.
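
As an illustrative example of the class of problems meant here (the paper's exact domain and boundary conditions are not reproduced), one may think of a Dirichlet problem for the wave equation posed to the right of a prescribed moving boundary:

    % Illustrative example only: 1D wave equation in a domain whose left
    % boundary x = l(t) moves in time, with l a given smooth curve.
    \[
      u_{tt}(x,t) = u_{xx}(x,t), \qquad x > l(t), \quad t > 0,
    \]
    \[
      u(x,0) = u_0(x), \qquad u_t(x,0) = u_1(x), \qquad u\bigl(l(t),t\bigr) = g(t).
    \]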

Relevance:

10.00%

Publisher:

Abstract:

Here we make an initial step toward the development of an ocean assimilation system that can constrain the modelled Atlantic Meridional Overturning Circulation (AMOC) to support climate predictions. A detailed comparison is presented of 1° and 1/4° resolution global model simulations with and without sequential data assimilation, to the observations and transport estimates from the RAPID mooring array across 26.5° N in the Atlantic. Comparisons of modelled water properties with the observations from the merged RAPID boundary arrays demonstrate the ability of in situ data assimilation to accurately constrain the east-west density gradient between these mooring arrays. However, the presence of an unconstrained "western boundary wedge" between Abaco Island and the RAPID mooring site WB2 (16 km offshore) leads to the intensification of an erroneous southwards flow in this region when in situ data are assimilated. The result is an overly intense southward upper mid-ocean transport (0–1100 m) as compared to the estimates derived from the RAPID array. Correction of upper layer zonal density gradients is found to compensate mostly for a weak subtropical gyre circulation in the free model run (i.e. with no assimilation). Despite the important changes to the density structure and transports in the upper layer imposed by the assimilation, very little change is found in the amplitude and sub-seasonal variability of the AMOC. This shows that assimilation of upper layer density information projects mainly on the gyre circulation with little effect on the AMOC at 26° N due to the absence of corrections to density gradients below 2000 m (the maximum depth of Argo). The sensitivity to initial conditions was explored through two additional experiments using a climatological initial condition. These experiments showed that the weak bias in gyre intensity in the control simulation (without data assimilation) develops over a period of about 6 months, but does so independently from the overturning, with no change to the AMOC. However, differences in the properties and volume transport of North Atlantic Deep Water (NADW) persisted throughout the 3 year simulations resulting in a difference of 3 Sv in AMOC intensity. The persistence of these dense water anomalies and their influence on the AMOC is promising for the development of decadal forecasting capabilities. The results suggest that the deeper waters must be accurately reproduced in order to constrain the AMOC.
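
For orientation, the link between the east-west density gradient and the upper mid-ocean transport is the thermal-wind relation; the sketch below is a textbook estimate using illustrative boundary density profiles and a level-of-no-motion assumption, not the RAPID methodology in detail.

    # Minimal thermal-wind sketch: relative meridional velocity and upper-ocean
    # transport from the east-west density difference between two boundary
    # profiles. Profiles, basin width, and the reference level are illustrative.
    import numpy as np

    g, rho0 = 9.81, 1025.0
    f = 2 * 7.292e-5 * np.sin(np.deg2rad(26.5))   # Coriolis parameter at 26.5N
    L = 5.5e6                                     # approximate basin width (m)

    z = np.arange(0.0, 1100.0 + 10.0, 10.0)       # depth (m), positive downward
    rho_w = 1025.0 + 0.0025 * z                   # illustrative western profile
    rho_e = 1025.3 + 0.0022 * z                   # illustrative eastern profile

    # Thermal wind: dv/d(depth) = (g / (rho0 * f)) * (rho_e - rho_w) / L
    shear = (g / (rho0 * f)) * (rho_e - rho_w) / L
    dz = z[1] - z[0]

    # Integrate shear upward from an assumed level of no motion at 1100 m.
    v = -np.cumsum(shear[::-1])[::-1] * dz        # m/s, relative to v(1100 m) = 0
    transport_sv = np.trapz(v, z) * L / 1e6       # Sverdrups (1 Sv = 1e6 m^3/s)
    print(f"upper mid-ocean transport ~ {transport_sv:.1f} Sv (negative = southward)")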

Relevance:

10.00%

Publisher:

Abstract:

A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant biasing of the randomness of the array, which would result in longer times for the algorithm to converge to a solution.

1 Introduction

In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
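
A software sketch of the idea is given below: a sequential approximation of the systolic array in which each mixed congruential generator is combined with the output of the preceding cell at every step. The constants, mixing rule, and array size are illustrative, not those of the proposed hardware.

    # Minimal software sketch of a linear array of mixed (linear) congruential
    # generators, each reseeded from the preceding generator's output at every
    # step, approximating the cross-seeding of the proposed systolic array.
    M = 2 ** 32
    A, C = 1664525, 1013904223          # classic mixed-LCG constants

    class ArrayRNG:
        def __init__(self, n_cells, seed=12345):
            self.state = [(seed + 9973 * i) % M for i in range(n_cells)]

        def step(self):
            """One array step: each cell advances its LCG, then is mixed with
            the output of the preceding cell (cell 0 wraps to the last cell)."""
            out = [(A * s + C) % M for s in self.state]
            self.state = [(out[i] ^ out[i - 1]) % M for i in range(len(out))]
            return out

    rng = ArrayRNG(n_cells=8)
    for _ in range(3):
        print([f"{x:08x}" for x in rng.step()])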

Relevance:

10.00%

Publisher:

Abstract:

Seasonal climate prediction offers the potential to anticipate variations in crop production early enough to adjust critical decisions. Until recently, interest in exploiting seasonal forecasts from dynamic climate models (e.g. general circulation models, GCMs) for applications that involve crop simulation models has been hampered by the difference in spatial and temporal scale of GCMs and crop models, and by the dynamic, nonlinear relationship between meteorological variables and crop response. Although GCMs simulate the atmosphere on a sub-daily time step, their coarse spatial resolution and resulting distortion of day-to-day variability limits the use of their daily output. Crop models have used daily GCM output with some success by either calibrating simulated yields or correcting the daily rainfall output of the GCM to approximate the statistical properties of historic observations. Stochastic weather generators are used to disaggregate seasonal forecasts either by adjusting input parameters in a manner that captures the predictable components of climate, or by constraining synthetic weather sequences to match predicted values. Predicting crop yields, simulated with historic weather data, as a statistical function of seasonal climatic predictors, eliminates the need for daily weather data conditioned on the forecast, but must often address poor statistical properties of the crop-climate relationship. Most of the work on using crop simulation with seasonal climate forecasts has employed historic analogs based on categorical ENSO indices. Other methods based on classification of predictors or weather types can provide daily weather inputs to crop models conditioned on forecasts. Advances in climate-based crop forecasting in the coming decade are likely to include more robust evaluation of the methods reviewed here, dynamically embedding crop models within climate models to account for crop influence on regional climate, enhanced use of remote sensing, and research in the emerging area of 'weather within climate'.
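
A minimal sketch of the weather-generator approach mentioned above is given below: a first-order Markov chain for rainfall occurrence with gamma-distributed wet-day amounts, whose transition probabilities are shifted toward a seasonal forecast. All parameter values are illustrative.

    # Minimal sketch of a daily stochastic weather generator conditioned on a
    # seasonal forecast: rainfall occurrence follows a two-state Markov chain,
    # wet-day amounts a gamma distribution, and the wet-day probabilities are
    # scaled by a forecast factor. Parameter values are illustrative only.
    import numpy as np

    rng = np.random.default_rng(42)

    # Baseline (climatological) parameters.
    p_wet_after_dry, p_wet_after_wet = 0.25, 0.60
    gamma_shape, gamma_scale = 0.8, 8.0            # wet-day rainfall (mm)

    def generate_season(n_days=90, forecast_factor=1.0):
        """forecast_factor > 1 shifts toward a wetter-than-normal forecast."""
        p01 = min(0.95, p_wet_after_dry * forecast_factor)
        p11 = min(0.95, p_wet_after_wet * forecast_factor)
        rain, wet = np.zeros(n_days), False
        for d in range(n_days):
            wet = rng.random() < (p11 if wet else p01)
            if wet:
                rain[d] = rng.gamma(gamma_shape, gamma_scale)
        return rain

    for factor in (0.8, 1.0, 1.2):                 # dry, normal, wet forecasts
        totals = [generate_season(forecast_factor=factor).sum() for _ in range(200)]
        print(f"forecast factor {factor}: mean seasonal total ~ {np.mean(totals):.0f} mm")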

Relevance:

10.00%

Publisher:

Abstract:

Development research has responded to a number of charges over the past few decades. For example, when traditional research was accused of being 'top-down', the response was participatory research, linking the 'receptors' to the generators of research. As participatory processes were recognised as producing limited outcomes, the demand-led agenda was born. In response to the alleged failure of research to deliver its products, the 'joined-up' model, which links research with the private sector, has become popular. However, using examples from animal-health research, this article demonstrates that all the aforementioned approaches are seriously limited in their attempts to generate outputs to address the multi-faceted problems facing the poor. The article outlines a new approach to research: the Mosaic Model. By combining different knowledge forms, and focusing on existing gaps, the model aims to bridge basic and applied findings to enhance the efficiency and value of research, past, present, and future.