914 results for Accelerated failure time model


Relevance:

100.00%

Publisher:

Abstract:

The glacial climate system transitioned rapidly between cold (stadial) and warm (interstadial) conditions in the Northern Hemisphere. This variability, referred to as Dansgaard-Oeschger variability, is widely believed to arise from perturbations of the Atlantic Meridional Overturning Circulation. Evidence for such changes during the longer Heinrich stadials has been identified, but direct evidence for overturning circulation changes during Dansgaard-Oeschger events has proven elusive. Here we reconstruct bottom-water [CO3^2-] variability from B/Ca ratios of benthic foraminifera and indicators of sedimentary dissolution, and use these reconstructions to infer the flow of northern-sourced deep water to the deep central sub-Antarctic Atlantic Ocean. We find that nearly every Dansgaard-Oeschger interstadial is accompanied by a rapid incursion of North Atlantic Deep Water into the deep South Atlantic. Based on these results and transient climate model simulations, we conclude that North Atlantic stadial-interstadial climate variability was associated with significant Atlantic overturning circulation changes that were rapidly transmitted across the Atlantic. However, by demonstrating the persistent role of Atlantic overturning circulation changes in past abrupt climate variability, our reconstructions of carbonate chemistry further indicate that the carbon cycle response to abrupt climate change was not a simple function of North Atlantic overturning.
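The proxy step described above is, in essence, a linear calibration between benthic foraminiferal B/Ca and bottom-water carbonate chemistry. A minimal sketch of that idea, where the function name, slope, and intercept are illustrative assumptions, not the calibration used in the study:

```python
def co3_sat_from_b_ca(b_ca_umol_mol, slope=1.14, intercept=177.1):
    """Estimate bottom-water Delta[CO3^2-] (umol/kg) from benthic B/Ca
    (umol/mol) via a linear calibration.  The coefficients here are
    illustrative placeholders, not the study's calibration."""
    return (b_ca_umol_mol - intercept) / slope
```

Under any such linear calibration, higher B/Ca maps monotonically to higher carbonate saturation, which is what lets B/Ca time series be read as [CO3^2-] variability.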

Relevance:

100.00%

Publisher:

Abstract:

A new methodology of seismic source characterization for poissonian probabilistic seismic hazard assessment is developed in this work. Active faults are treated as independent seismogenic sources in combination with seismogenic area sources. The methodology is based on distributing the seismic moment rate recorded in a region among the potentially active seismic sources that it contains (independently modelled active faults and an area-source model), with special emphasis on regions of moderate seismicity and faults with slow deformation rates.
An application of the methodology is carried out in the southeastern part of Spain, incorporating 106 independent seismic sources in the computations: 95 of fault type (catalogued as active faults in the Quaternary Active Fault Database, QAFI) and 11 of area-source type. The seismic hazard is also estimated with the classical area-source method, and the results of the two methodologies (the classical one and the one proposed in this work) are compared, analyzing the influence of including faults as independent sources on the hazard estimates. Finally, an application of the proposed methodology under a non-poissonian time model is shown. This application is carried out on the Carboneras fault and shows the repercussions that this change of time model has on the final hazard estimates.
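The moment-rate budgeting at the heart of such a methodology can be sketched in a few lines. The function names, the proportional partition rule, and all numeric values here are illustrative assumptions; the actual partition rules are those defined in the work itself:

```python
MU_PA = 3.0e10  # typical crustal shear modulus, Pa (illustrative value)

def fault_moment_rate(area_m2, slip_rate_m_yr, mu=MU_PA):
    """Moment rate a fault accrues per year: M0_dot = mu * A * s (N*m/yr)."""
    return mu * area_m2 * slip_rate_m_yr

def partition_regional_rate(source_budgets, regional_rate):
    """Split a regionally recorded moment rate among sources in proportion
    to each source's own moment-rate budget (a simple proportional rule)."""
    total = sum(source_budgets)
    return [regional_rate * b / total for b in source_budgets]
```

The partition step is what lets faults and background area sources share one regional budget without double-counting the recorded seismicity.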

Relevance:

100.00%

Publisher:

Abstract:

Studies of initial activities of carbon monoxide dehydrogenase (CODH) from Rhodospirillum rubrum show that CODH is mostly inactive at redox potentials higher than −300 mV. Initial activities measured over a wide range of redox potentials (0 to −500 mV) fit a function corresponding to the Nernst equation with a midpoint potential of −316 mV. Previous extensive EPR studies of CODH have suggested that CODH has three distinct redox states: (i) a spin-coupled state at −60 to −300 mV that gives rise to an EPR signal termed Cred1; (ii) uncoupled states at <−320 mV in the absence of CO2, referred to as Cunc; and (iii) another spin-coupled state at <−320 mV in the presence of CO2 that gives rise to an EPR signal termed Cred2B. Because there is no initial CODH activity at potentials that give rise to Cred1, this state is not involved in the catalytic mechanism of the enzyme. At potentials more positive than −380 mV, CODH recovers its full activity over time when incubated with CO. This reductant-dependent conversion of CODH from an inactive to an active form is referred to hereafter as "autocatalysis." Analyses of the autocatalytic activation process suggest that autocatalysis is initiated by a small fraction of activated CODH: this fraction catalyzes CO oxidation and consequently lowers the redox potential of the assay system, and the process accelerates with time because of accumulation of the active enzyme.
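The Nernst fit reported above corresponds to a sigmoidal fraction of reduced (active) enzyme as a function of poised potential. A sketch of that function, assuming a one-electron couple (n = 1 is an assumption for illustration; the abstract only reports the midpoint potential):

```python
import math

F = 96485.0  # Faraday constant, C/mol
R = 8.314    # gas constant, J/(mol*K)

def active_fraction(e_mv, em_mv=-316.0, n=1, temp_k=298.15):
    """Nernstian fraction of enzyme in the reduced (active) state when the
    assay is poised at e_mv (mV), for a midpoint potential em_mv (mV)."""
    exponent = n * F * (e_mv - em_mv) / 1000.0 / (R * temp_k)
    return 1.0 / (1.0 + math.exp(exponent))
```

At −316 mV exactly half the enzyme is reduced; well above the midpoint (e.g. −200 mV) the active fraction collapses, matching the observed loss of initial activity above −300 mV.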

Relevance:

100.00%

Publisher:

Abstract:

We explore the role of business services in knowledge accumulation and growth, and the determinants of knowledge diffusion, including the role of distance. A continuous-time model is estimated for several European countries, Japan, and the US. Policy simulations illustrate the benefits for EU growth of deepening the single market, reducing regulatory barriers, and accumulating technology and human capital. Our results support the basic insights of the Lisbon Agenda: economic growth in Europe is enhanced to the extent that trade in services increases, technology accumulation and diffusion increase, regulation becomes both less intensive and more uniform across countries, and human capital accumulation increases in all countries.

Relevance:

100.00%

Publisher:

Abstract:

Measurements of Fe(II) and H2O2 were carried out in the Atlantic sector of the Southern Ocean during EisenEx, an iron enrichment experiment. Iron was added on three separate occasions, approximately every 8 days, as a ferrous sulfate (FeSO4) solution. Vertical profiles of Fe(II) showed maxima consistent with the plume of the iron infusion, while H2O2 profiles revealed corresponding minima, showing the effect of oxidation of Fe(II) by H2O2. Observations showed that detectable Fe(II) concentrations persisted for up to 8 days after an iron infusion. H2O2 concentrations increased at the depth of the chlorophyll maximum when iron concentrations returned to pre-infusion concentrations (<80 pM), possibly due to biological production related to iron reductase activity. In this work, Fe(II) and dissolved iron were themselves used as tracers for subsequent iron infusions when no further SF6 was added. EisenEx was subject to periods of weak and strong mixing. Slow mixing after the second infusion allowed significant concentrations of Fe(II) and Fe to persist for several days. During this time, dissolved and total iron in the infusion plume behaved almost conservatively, as it was trapped between a relict mixed layer and a new rain-induced mixed layer. Using dissolved iron, a value for the vertical diffusion coefficient Kz = 6.7 ± 0.7 cm^2/s was obtained for this 2-day period. During a subsequent surface survey of the iron-enriched patch, elevated levels of Fe(II) were found in surface waters, presumably from Fe(II) dissolved in the rainwater that was falling at this time. Model results suggest that the reaction between uncomplexed Fe(III) and superoxide (O2−) was a significant source of Fe(II) during EisenEx and helped to maintain high levels of Fe(II) in the water column.
This phenomenon may occur in iron enrichment experiments when two conditions are met: (i) when Fe is added to a system already saturated with regard to organic complexation, and (ii) when mixing processes are slow, thereby reducing the dispersion of iron into under-saturated waters.
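A vertical diffusion coefficient of the kind quoted above (Kz in cm^2/s from a tracer profile) can be estimated from how fast the vertical spread of a tracer patch grows between surveys. A minimal sketch of that estimator; the Gaussian-spreading assumption and the numbers used below are illustrative, not the EisenEx data:

```python
def vertical_diffusivity_cm2_s(sigma1_m, sigma2_m, dt_s):
    """Kz from the growth of a tracer patch's vertical standard deviation
    between two surveys: Kz = (sigma2^2 - sigma1^2) / (2 * dt), assuming
    Gaussian vertical spreading.  Inputs in metres and seconds; returns
    cm^2/s."""
    kz_m2_s = (sigma2_m ** 2 - sigma1_m ** 2) / (2.0 * dt_s)
    return kz_m2_s * 1.0e4  # convert m^2/s -> cm^2/s
```

For example, a patch whose vertical spread grows from 10 m to 15 m over two days gives a Kz of a few cm^2/s, the same order as the value reported above.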

Relevance:

100.00%

Publisher:

Abstract:

The present time.--Model prison.--Downing Street.--The new Downing Street.--Stump-orator.--Parliaments.--Hudson's statue.--Jesuitism

Relevance:

100.00%

Publisher:

Abstract:

An application of a neural network algorithm for increasing the accuracy of navigation systems is shown. In navigation systems where a pair of sensors is used in the same device in different positions, and disturbances act equally on both sensors, a trained neural network can increase the accuracy of the system. The neural algorithm is used to determine the interconnection between the sensor errors in the two channels, avoiding the unobservability of the navigation system. Representing the thermal error of two-component navigation sensors by a time model, whose coefficients depend only on the parameters of the device and its orientation relative to the disturbance vector, allows thermal error changes to be predicted by measuring the current temperature, once the model parameters have been identified for the set position. These properties of the thermal model are used for training the neural network and compensating the errors of the navigation system in non-stationary thermal fields.
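The compensation idea described above amounts to subtracting a temperature-dependent error model from the raw sensor reading. A sketch of that step only; the polynomial form, function names, and coefficients are illustrative assumptions standing in for the identified model, not the trained network of the paper:

```python
def thermal_error(temp_c, coeffs=(0.05, 0.002, 0.0001)):
    """Modelled thermal error of a sensor as a polynomial in temperature:
    err = a0 + a1*T + a2*T**2.  The coefficients are illustrative
    placeholders (in the paper they depend on the device and its
    orientation relative to the disturbance vector)."""
    a0, a1, a2 = coeffs
    return a0 + a1 * temp_c + a2 * temp_c ** 2

def compensate(raw_reading, temp_c, coeffs=(0.05, 0.002, 0.0001)):
    """Subtract the modelled thermal error from a raw sensor reading."""
    return raw_reading - thermal_error(temp_c, coeffs)
```

Once the coefficients are identified for a given mounting position, only the current temperature is needed at run time to apply the correction.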

Relevance:

100.00%

Publisher:

Abstract:

Climatic changes cause alterations in the circulation patterns of the world oceans. The highly saline Mediterranean Outflow Water (MOW), formed within the Mediterranean Sea, crosses the Strait of Gibraltar westward, then turns north-westward and follows the Iberian Slope at 600-1500 m water depth. The circulation pattern and current speed of the MOW are strongly influenced by climatically induced variations and thus control sedimentation processes along the South and West Iberian Continental Slope. Sedimentation characteristics of the investigated area are therefore suitable for reconstructing temporal hydrodynamic changes of the MOW. Detailed investigations of the silt-sized grain distribution, physical properties, and hydroacoustic data were performed to recalculate paleo-current velocities and to understand the sedimentation history of the Gulf of Cadiz and the Portuguese Continental Slope. A time model based on δ18O data and 14C datings of planktic foraminifera allowed the stratigraphical classification of the core material and thus the dating of the current-induced sediment layers showing the variations of paleo-current intensities. The evaluation and interpretation of the gathered data sets enabled us to reconstruct lateral and temporal sedimentation patterns of the MOW for the Holocene and the late Pleistocene, back to the Last Glacial Maximum (LGM).
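Paleo-current speeds of the kind recalculated above are commonly derived from the mean grain size of the current-sorted silt fraction through a linear calibration. A sketch of that mapping; the slope and intercept are illustrative placeholders, not the calibration used in this study:

```python
def paleo_current_speed_cm_s(ss_mean_um, slope=1.36, intercept=-12.0):
    """Near-bottom flow speed (cm/s) from sortable-silt mean grain size
    (um) via a linear calibration U = slope * SS + intercept.  The
    coefficients here are illustrative placeholders only."""
    return slope * ss_mean_um + intercept
```

Under any such calibration, coarser current-sorted silt in a dated layer is read as a faster MOW flow at the time that layer was deposited.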

Relevance:

100.00%

Publisher:

Abstract:

Motivated by new and innovative rental business models, this paper develops a novel discrete-time model of a rental operation with random loss of inventory due to customer use. The inventory level is chosen before the start of a finite rental season, and customers not immediately served are lost. Our analysis framework uses stochastic comparisons of sample paths to derive structural results that hold in considerable generality for demands, rental durations, and rental unit lifetimes. Considering different "recirculation" rules (i.e., which rental unit to choose to meet each demand), we prove the concavity of the expected profit function and identify the optimal recirculation rule. A numerical study clarifies when considering rental unit loss and recirculation rules matters most for the inventory decision: accounting for rental unit loss can increase the expected profit by 7% for a single season and becomes even more important as the time horizon lengthens. We also observe that the optimal inventory level in response to increasing loss probability is non-monotonic. Finally, we show that choosing the optimal recirculation rule over another simple policy allows more rental units to be profitably added, and the profit-maximizing service level increases by up to 6 percentage points.
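The flavour of the model can be conveyed with a toy single-season simulation in which each rental carries a probability of destroying the unit. This is a deliberately simplified sketch (one-period rentals, interchangeable units, so recirculation rules do not matter here), not the paper's model; all names and numbers are illustrative:

```python
import random

def season_profit(n_units, horizon, p_demand, p_loss,
                  rent=1.0, unit_cost=5.0, seed=0):
    """Simulate one finite rental season in discrete time.  Each period at
    most one customer arrives (prob. p_demand) and is lost if no unit is
    available; each completed rental destroys the unit with prob. p_loss.
    Returns season profit net of the upfront cost of the fleet."""
    rng = random.Random(seed)
    units = n_units
    profit = -unit_cost * n_units
    for _ in range(horizon):
        if units > 0 and rng.random() < p_demand:
            profit += rent
            if rng.random() < p_loss:  # unit lost to customer use
                units -= 1
    return profit
```

Averaging `season_profit` over many seeds for each candidate `n_units` gives the expected-profit curve whose concavity the paper establishes analytically.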

Relevance:

100.00%

Publisher:

Abstract:

This presentation was both an illustrated lecture and a published paper presented at the IMPACT 9 conference Printmaking in the Post-Print Age, Hangzhou, China, 2015. It was an extension of the exhibition catalogue essay for the Bluecoat Gallery exhibition of the same name. In 2014 I curated an exhibition, The Negligent Eye, at the Bluecoat Gallery in Liverpool, as the result of a longstanding interest in scanning and 3D printing and the role of these in changing the field of print within fine art practice. In the aftermath of curating the show I have continued to reflect on this material with reference to the writings of Vilém Flusser and Hito Steyerl. The work in the exhibition came from a wide range of artists of all generations, most of whom are not explicitly located within printmaking. While some work did not use any scanning technology at all, a shared fascination with the particular translating device of the systematizing 'eye' of a scanning digital video camera, flatbed or medical scanner was expressed by all the work in the show. Through writing this paper I aim to extend my own understanding of questions which arose from the juxtapositions of work and the production of the accompanying catalogue. The show developed in dialogue with curators Bryan Biggs and Sarah-Jane Parsons of the Bluecoat Gallery, who sent a series of questions about scanning to participating artists. In reflecting upon their answers I will extend the discussions begun in the process of this research. A kind of created attention deficit disorder seems to operate on us all today, pressing us to make and distribute images and information at speed. What value do ways of making which require slow looking or intensive material exploration have in this accelerated system? What model of the world is being constructed by the drive of simulated realities toward ever-greater resolution, so-called high definition?
How are our perceptions of reality being altered by the world-view presented in the smooth, colourful, ever-morphing simulations that surround us? The limitations of digital technology are often a starting point for artists to reflect on our relationship to real-world fragility. I will be looking at practices where tactility or dimensionality in a form of hard copy engages with these questions, using examples from the exhibition. Artists included in the show were: Cory Arcangel, Christiane Baumgartner, Thomas Bewick, Jyll Bradley, Maurice Carlin, Helen Chadwick, Susan Collins, Conroy/Sanderson, Nicky Coutts, Elizabeth Gossling, Beatrice Haines, Juneau Projects, Laura Maloney, Bob Matthews, London Fieldworks (with the participation of Gustav Metzger), Marilène Oliver, Flora Parrott, South Atlantic Souvenirs, Imogen Stidworthy, Jo Stockham, Wolfgang Tillmans, Alessa Tinne, Michael Wegerer, Rachel Whiteread, Jane and Louise Wilson. Keywords: scanning, art, technology, copy, materiality.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents quantitative studies of T cell and dendritic cell (DC) behaviour in mouse lymph nodes (LNs) in the naive state and following immunisation. These processes are of importance and interest in basic immunology, and better understanding could improve both diagnostic capacity and therapeutic manipulations, potentially helping to produce more effective vaccines or to develop treatments for autoimmune diseases. The problem is also interesting conceptually, as it is relevant to other fields where the 3D movement of objects is tracked with a discrete scanning interval. A general immunology introduction is presented in chapter 1. In chapter 2, I apply quantitative methods to multi-photon imaging data to measure how T cells and DCs are spatially arranged in LNs. This has previously been studied to describe differences between the naive and immunised states and as an indicator of the magnitude of the immune response in LNs, but previous analyses have been largely descriptive. The quantitative analysis shows that some of the previous conclusions may have been premature. In chapter 3, I use Bayesian state-space models to test hypotheses about the mode of T cell search for DCs. A two-state mode of movement, in which T cells can be classified as either interacting with a DC or freely migrating, is supported over a model in which T cells home in on DCs at a distance through, for example, the action of chemokines. In chapter 4, I study whether T cell migration is linked to the geometric structure of the fibroblast reticular cell (FRC) network. I find support for the hypothesis that the movement is constrained to the FRC network over an alternative 'random walk with persistence time' model in which cells would move randomly, with a short-term persistence driven by a hypothetical T cell-intrinsic 'clock'. I also present unexpected results on the FRC network geometry.
Finally, a quantitative method is presented for addressing some measurement biases inherent to multi-photon imaging. In all three chapters, novel findings are made, and the methods developed have the potential for further use in addressing important problems in the field. In chapter 5, I present a summary and synthesis of the results of chapters 3-4 and a more speculative discussion of these results and potential future directions.
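The two-state classification supported in chapter 3 can be caricatured with a simple speed threshold on track steps. This is a toy stand-in for the Bayesian state-space model, not the method used in the thesis; the threshold value and function name are illustrative assumptions:

```python
def classify_steps(speeds_um_min, threshold_um_min=2.0):
    """Label each T-cell track step as 'interacting' (slow, presumed in
    contact with a DC) or 'migrating' (fast), by thresholding the
    instantaneous speed in um/min.  A toy two-state classifier; the
    threshold is an illustrative assumption."""
    return ['interacting' if s < threshold_um_min else 'migrating'
            for s in speeds_um_min]
```

A state-space model improves on this caricature by inferring the hidden state probabilistically from the whole track rather than from each step in isolation, which is what makes the model comparison in chapter 3 possible.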

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, organizations tend to develop in order to become more effective and efficient. In this context, this study proposes a model for calculating the Costs of Quality (CQ) in the maintenance and sustainment of Portuguese Air Force (FA) weapon systems, contributing to the continuous improvement of its Airworthiness and Quality Management System (SGQA). The study evaluates the use of the Prevention, Appraisal and Failure (PAF) model for calculating CQ within the SGQA, how Information Systems (SI) can contribute to this calculation, and which system structure should integrate and operationalize the model.
This investigation uses hypothetical-deductive reasoning, through a qualitative strategy applied to a case study of the Epsilon TB-30 aircraft. After presenting an initial theoretical framework, the raised hypotheses are tested through analysis of the relevant documents and interviews with personnel in key functions within this scope. The study shows that it is possible to use the PAF model to calculate the CQ of the SGQA. However, it is necessary to adapt the SI and the system processes for the operationalization of this model. Finally, an implementation plan for the evaluated CQ model is proposed, and some recommendations are made for its future development.
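The arithmetic behind the PAF model is a simple decomposition of the total Cost of Quality into conformance costs (prevention and appraisal) and non-conformance costs (internal and external failures). A minimal sketch; the function name and the figures in the example are illustrative:

```python
def quality_costs(prevention, appraisal, internal_failure, external_failure):
    """Total Cost of Quality under the PAF model: conformance costs
    (prevention + appraisal) plus non-conformance costs (internal +
    external failures)."""
    conformance = prevention + appraisal
    nonconformance = internal_failure + external_failure
    return {'conformance': conformance,
            'nonconformance': nonconformance,
            'total': conformance + nonconformance}
```

Tracking the two components separately is what makes the model useful for improvement: spending more on prevention and appraisal should, over time, reduce the failure component.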

Relevance:

100.00%

Publisher:

Abstract:

Doctorate in Mathematics

Relevance:

60.00%

Publisher:

Abstract:

An effective prognostics program provides ample lead time for maintenance engineers to schedule a repair and to acquire replacement components before catastrophic failures occur. This paper presents a technique for accurate assessment of the remnant life of machines based on a health-state probability estimation technique. For a comparative study of the proposed model with the proportional hazards model (PHM), experimental bearing failure data from an accelerated bearing test rig were used. The results show that the proposed prognostic model based on health-state probability estimation can provide more accurate predictions than the commonly used PHM in the bearing failure case study.
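One way to read "remnant life from health-state probabilities" is as a probability-weighted average of per-state residual lives. A sketch of that general idea only, not the paper's estimator; the states, probabilities, and hours in the example are hypothetical:

```python
def expected_remnant_life(state_probs, state_rul_hours):
    """Point estimate of remnant life as the probability-weighted mean of
    per-health-state residual lives.  state_probs must sum to 1; the
    per-state residual lives would come from training failure data."""
    assert abs(sum(state_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * r for p, r in zip(state_probs, state_rul_hours))
```

As the classifier shifts probability mass toward more degraded states over the bearing's life, this estimate shrinks, which is what allows repairs to be scheduled before failure.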