948 results for diffusive viscoelastic model, global weak solution, error estimate


Abstract:

We propose a method that robustly combines color and feature buffers to denoise Monte Carlo renderings. On one hand, feature buffers, such as per-pixel normals, textures, or depth, are effective in determining denoising filters because features are highly correlated with rendered images. Filters based solely on features, however, are prone to blurring image details that are not well represented by the features. On the other hand, color buffers represent all details, but they may be less effective for determining filters because they are contaminated by the very noise that is to be removed. We propose to obtain filters using a combination of color and feature buffers in an NL-means and cross-bilateral filtering framework. We determine a robust weighting of colors and features using a SURE-based error estimate. We show significant improvements in subjective and quantitative errors compared to the previous state of the art. We also demonstrate adaptive sampling and space-time filtering for animations.
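To make the combination concrete, here is a minimal sketch of joint color/feature range weighting in a cross-bilateral filter on a 1-D signal, assuming only numpy. The buffer contents, bandwidths (sigma_c, sigma_f), and the synthetic data are illustrative assumptions; the paper's actual estimator additionally selects the weighting with a SURE-based error estimate, which is not implemented here.

```python
# Minimal sketch: cross-bilateral weights that combine a noisy color
# buffer with a clean feature buffer (e.g. depth). Bandwidths and the
# 1-D setting are illustrative assumptions, not the paper's filter.
import numpy as np

def cross_bilateral_weights(color, feature, i, sigma_c=0.3, sigma_f=0.05):
    """Weights for pixel i: product of color and feature range kernels."""
    dc = (color - color[i]) ** 2
    df = (feature - feature[i]) ** 2
    return np.exp(-dc / (2 * sigma_c**2)) * np.exp(-df / (2 * sigma_f**2))

def denoise(color, feature):
    out = np.empty_like(color)
    for i in range(len(color)):
        w = cross_bilateral_weights(color, feature, i)
        out[i] = np.sum(w * color) / np.sum(w)
    return out

rng = np.random.default_rng(0)
feature = np.linspace(0.0, 1.0, 64)        # stands in for a depth buffer
clean = np.sin(4 * np.pi * feature)
color = clean + rng.normal(0.0, 0.2, 64)   # noisy rendered color
print("input MSE: ", np.mean((color - clean) ** 2))
print("output MSE:", np.mean((denoise(color, feature) - clean) ** 2))
```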

Abstract:

The aim of this work is to elucidate the impact of changes in solar irradiance and energetic particles versus volcanic eruptions on tropospheric global climate during the Dalton Minimum (DM, AD 1780–1840). Variations in (i) the solar irradiance in the UV-C at wavelengths λ < 250 nm, (ii) the irradiance at wavelengths λ > 250 nm, (iii) the energetic particle spectrum, and (iv) the volcanic aerosol forcing were analyzed separately, and (v) in combination, by means of small-ensemble calculations using a coupled atmosphere–ocean chemistry–climate model. Global and hemispheric mean surface temperatures show a significant dependence on solar irradiance at λ > 250 nm. Also, powerful volcanic eruptions in 1809, 1815, 1831 and 1835 significantly decreased the global mean temperature by up to 0.5 K for 2–3 years after each eruption. However, while the volcanic effect is clearly discernible in the Southern Hemispheric mean temperature, it is less significant in the Northern Hemisphere, partly because the two largest volcanic eruptions occurred in the SH tropics and during seasons when the aerosols were mainly transported southward, and partly because of the higher northern internal variability. In the simulation including all forcings, temperatures are in reasonable agreement with the tree-ring-based temperature anomalies of the Northern Hemisphere. Interestingly, the model suggests that solar irradiance changes at λ < 250 nm and in the energetic particle spectra have only an insignificant impact on the climate during the Dalton Minimum. This plays down the importance of top-down processes (stemming from changes at λ < 250 nm) relative to bottom-up processes (from λ > 250 nm). Reduction of irradiance at λ > 250 nm leads to a significant (up to 2%) decrease in the ocean heat content (OHC) between 0 and 300 m depth, whereas the changes in irradiance at λ < 250 nm or in energetic particles have virtually no effect. Also, volcanic aerosol yields a very strong response, reducing the OHC of the upper ocean by up to 1.5%. In the simulation with all forcings, the OHC of the uppermost levels recovers 8–15 years after a volcanic eruption, while the solar signal and the successive volcanic eruptions dominate the OHC changes in the deeper ocean and prevent its recovery during the DM. Finally, the simulations suggest that the volcanic eruptions during the DM had a significant impact on precipitation patterns, caused by a widening of the Hadley cell and a shift of the intertropical convergence zone.

Abstract:

BACKGROUND Estimates of the size of the undiagnosed HIV-infected population are important to understand the HIV epidemic and to plan interventions, including "test-and-treat" strategies. METHODS We developed a multi-state back-calculation model to estimate HIV incidence, time between infection and diagnosis, and the undiagnosed population by CD4 count strata, using surveillance data on new HIV and AIDS diagnoses. The HIV incidence curve was modelled using cubic splines. The model was tested on simulated data and applied to surveillance data on men who have sex with men in The Netherlands. RESULTS The number of HIV infections could be estimated accurately using simulated data, with most values within the 95% confidence intervals of model predictions. When applying the model to Dutch surveillance data, 15,400 (95% confidence interval [CI] = 15,000, 16,000) men who have sex with men were estimated to have been infected between 1980 and 2011. HIV incidence showed a bimodal distribution, with peaks around 1985 and 2005 and a decline in recent years. Mean time to diagnosis was 6.1 (95% CI = 5.8, 6.4) years between 1984 and 1995 and decreased to 2.6 (2.3, 3.0) years in 2011. By the end of 2011, 11,500 (11,000, 12,000) men who have sex with men in The Netherlands were estimated to be living with HIV, of whom 1,750 (1,450, 2,200) were still undiagnosed. Of the undiagnosed men who have sex with men, 29% (22, 37) were infected for less than 1 year, and 16% (13, 20) for more than 5 years. CONCLUSIONS This multi-state back-calculation model will be useful to estimate HIV incidence, time to diagnosis, and the undiagnosed HIV epidemic based on routine surveillance data.
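The core of any back-calculation is the linear relation between past incidence and observed diagnoses through a time-to-diagnosis distribution. The sketch below illustrates that relation numerically, using an invented bimodal incidence curve, a geometric delay distribution, and a plain least-squares inversion; the paper's model is richer (multi-state CD4 progression, cubic-spline incidence, confidence intervals).

```python
# Minimal numerical sketch of back-calculation: yearly diagnoses are a
# convolution of past incidence with a time-to-diagnosis distribution,
# so incidence can be recovered by inverting that linear system. All
# distributions here are illustrative assumptions.
import numpy as np

T = 30                                    # years of surveillance
t = np.arange(T)
true_incidence = 200 * np.exp(-0.5 * ((t - 8) / 4.0) ** 2) \
               + 150 * np.exp(-0.5 * ((t - 22) / 3.0) ** 2)  # bimodal

# Geometric time-to-diagnosis distribution with mean ~5 years.
p = 1.0 / 5.0
delay = p * (1 - p) ** np.arange(T)

# Lower-triangular matrix: A[i, j] = P(diagnosis i-j years after
# infection in year j).
A = np.zeros((T, T))
for j in range(T):
    A[j:, j] = delay[: T - j]

diagnoses = A @ true_incidence            # expected yearly diagnoses

# Back-calculate incidence from diagnoses (noise-free, so exact here;
# real data needs smoothing, e.g. the spline penalty in the paper).
est = np.linalg.lstsq(A, diagnoses, rcond=None)[0]
print("max abs error:", np.max(np.abs(est - true_incidence)))
```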

Abstract:

Statistical methods are developed which assess survival data for two attributes: (1) prolongation of life and (2) quality of life. Health state transition probabilities correspond to prolongation of life and are modeled as a discrete-time semi-Markov process. Embedded within the sojourn time of a particular health state are the quality of life transitions. They reflect events which differentiate perceptions of pain and suffering over a fixed time period. Quality of life transition probabilities are derived from the assumptions of a simple Markov process. These probabilities depend on the health state currently occupied and the next health state to which a transition is made. Utilizing the two attributes, the model has the capability to estimate the distribution of expected quality-adjusted life years (in addition to the distribution of expected survival times). The expected quality of life can also be estimated within the health state sojourn time, making the assessment of utility preferences more flexible. The methods are demonstrated on a subset of follow-up data from the Beta Blocker Heart Attack Trial (BHAT). This model contains the structure necessary to make inferences when assessing a general survival problem with a two-dimensional outcome.
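As a simplified illustration of quality-adjusted survival, the sketch below simulates a plain discrete-time Markov health-state process and weights each cycle by a quality utility to estimate expected QALYs. The states, transition matrix, and utilities are invented, and the paper's quality-of-life transitions nested within sojourn times are omitted.

```python
# Illustrative Monte Carlo estimate of expected QALYs under a toy
# discrete-time Markov health-state process. All numbers are invented
# for the example; the paper's model is semi-Markov.
import numpy as np

P = np.array([[0.90, 0.08, 0.02],    # per-cycle transitions: well
              [0.10, 0.75, 0.15],    # sick
              [0.00, 0.00, 1.00]])   # dead (absorbing)
utility = np.array([1.0, 0.6, 0.0])  # quality weight while in each state

def simulate_qalys(n_patients=20_000, horizon=50, seed=0):
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_patients):
        s = 0                                 # start in "well"
        for _ in range(horizon):
            if s == 2:                        # absorbed in "dead"
                break
            total += utility[s]               # quality-weighted cycle
            s = rng.choice(3, p=P[s])
    return total / n_patients

print(f"expected QALYs over 50 cycles: {simulate_qalys():.2f}")
```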

Abstract:

A high-resolution stratigraphy is essential for deciphering climate variability in detail and understanding causality arguments of events in Earth history. Because the highly dynamic middle to late Eocene provides a suitable testing ground for carbon cycle models of a waning warm world, an accurate time scale is needed to decode climate-driving mechanisms. Here we present new results from ODP Site 1260 (Leg 207), which covers a uniquely expanded middle Eocene section (magnetochrons C18r to C20r, late Lutetian to early Bartonian) of the tropical western Atlantic, including the chron C19r transient hyperthermal event and the Middle Eocene Climate Optimum (MECO). To establish a detailed cyclostratigraphy, we acquired distinctive iron intensity records by XRF scanning of the Site 1260 cores. We revise the shipboard composite section, establish a cyclostratigraphy, and use the exceptional eccentricity-modulated precession cycles for orbital tuning. The new astrochronology revises the ages of magnetic polarity chrons C19n to C20n, validates the position of very long eccentricity minima at 40.2 and 43.0 Ma in the orbital solutions, and extends the Astronomically Tuned Geological Time Scale back to 44 Ma. For the first time, the new data provide clear evidence for an orbital pacing of the chron C19r event and a likely involvement of the very long eccentricity cycle in the evolution of the MECO.

Abstract:

Secchi depth is a measure of water transparency. In the Baltic Sea region, Secchi depth maps are used to assess eutrophication and as input for habitat models. Due to their spatial and temporal coverage, satellite data would be the most suitable data source for such maps. But the Baltic Sea's optical properties are so different from those of the open ocean that globally calibrated standard models suffer from large errors. Regional predictive models that take the Baltic Sea's special optical properties into account are thus needed. This paper tests how accurately generalized linear models (GLMs) and generalized additive models (GAMs), with MODIS/Aqua and auxiliary data as inputs, can predict Secchi depth at a regional scale. It uses cross-validation to test the prediction accuracy of hundreds of GAMs and GLMs with up to 5 input variables. A GAM with 3 input variables (chlorophyll a, remote sensing reflectance at 678 nm, and long-term mean salinity) made the most accurate predictions. Tested against field observations not used for model selection and calibration, the best model's mean absolute error (MAE) for daily predictions was 1.07 m (22%), more than 50% lower than that of other publicly available Baltic Sea Secchi depth maps. The MAE for predicting monthly averages was 0.86 m (15%). Thus, the proposed model selection process was able to find a regional model with good prediction accuracy. The process could also be useful for finding predictive models for environmental variables other than Secchi depth, using data from other satellite sensors, and for other regions where non-standard remote sensing models are needed for prediction and mapping. Annual and monthly mean Secchi depth maps for 2003-2012 accompany this paper as Supplementary materials.
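A hedged sketch of the selection idea: fit an additive spline regressor (a GAM-like model built from scikit-learn's SplineTransformer) on three predictors and score it by cross-validated MAE, as the paper does for hundreds of candidate GAMs and GLMs. The synthetic data and feature proxies are assumptions; the study used MODIS/Aqua chlorophyll a, reflectance at 678 nm, and long-term mean salinity.

```python
# GAM-like additive spline regression scored by cross-validated MAE.
# Synthetic stand-in data; not the study's MODIS/Aqua inputs.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.lognormal(1.0, 0.5, n),   # chlorophyll-a proxy
    rng.uniform(0.0, 0.01, n),    # Rrs(678) proxy
    rng.uniform(3.0, 10.0, n),    # long-term mean salinity proxy
])
# Synthetic Secchi depth: decreasing in chlorophyll, plus noise.
y = 8.0 / (1.0 + 0.4 * X[:, 0]) + 50.0 * X[:, 1] + 0.1 * X[:, 2] \
    + rng.normal(0.0, 0.5, n)

# Spline basis per predictor + linear fit = a simple additive model.
gam_like = make_pipeline(SplineTransformer(n_knots=5, degree=3),
                         LinearRegression())
mae = -cross_val_score(gam_like, X, y, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"cross-validated MAE: {mae:.2f} m")
```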

Abstract:

Micro-crystalline barites recovered by deep-sea drilling from Site 684 on the Peru margin and Site 799 in the Japan Sea are highly enriched in the heavy sulfur isotope relative to seawater (δ34S up to +84‰). This isotopic composition is consistent with remobilization of biogenic barite triggered by sulfate reduction, and subsequent reprecipitation as a diagenetic barite front. The high levels of barium sulfate in these deposits (10-50%) cannot be explained by a diffusive transport model in sediments experiencing a constant rate of sedimentation. When sedimentation rates change radically, the barite front will remain at a given depth interval, leading to large accumulations of barium sulfate. Such conditions may have generated the barite deposits at Site 799. At Site 684, on the other hand, there is evidence that the barite deposits are a result of the tectonically driven advection of sulfate-bearing fluids through the sediment column.

Abstract:

Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them and greater reliance will be placed on the delivery of their services. A relevant subset of them is high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, motivated the interconnection of electronic devices; many of these networks can transfer data at high speed. The concept of distributed systems emerged for systems whose parts are executed on several nodes that interact with each other via a communication network. Java's popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This was the motivation for the development of the RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification under the Java Community Process (JSR-302) is being developed. Its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed. The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware that is suitable for the development of distributed hard real-time systems in Java, based on the integration of the RMI (Remote Method Invocation) model and the HRTJ profile. It has been designed and implemented with the main requirements in mind, such as predictability and reliability in the timing behavior and the resource usage. The design starts with the definition of a computational model which identifies, among other things: the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. It also includes mechanisms to monitor the functional and timing behavior. It provides independence from the network protocol by defining a network interface and modules. The JRMP protocol was modified to include the two phases, non-functional parameters, and message size optimizations.
Although serialization is one of the fundamental operations for ensuring proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (actual consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.

Abstract:

A sustainable manufacturing process must rely on an equally sustainable supply of raw materials and energy. This paper presents the results of studies on sustainable business models for the minerals industry as a fundamental precursor to a sustainable manufacturing process. As has happened in other economic activities, the mining and minerals industry has come under tremendous pressure to improve its social, developmental, and environmental performance. Mining, refining, and the use and disposal of minerals have in some instances led to significant local environmental and social damage. Nowadays, as elsewhere in the corporate world, companies are routinely expected to perform to ever higher standards of behavior, going well beyond achieving the best rate of return for shareholders. They are also increasingly being asked to be more transparent and subject to third-party audit or review, especially in environmental matters. In terms of the environment, there are three inter-related areas where innovation and new business models can make the biggest difference: carbon, water and biodiversity. The focus is on these three areas for two reasons. First, the industrial and energy minerals industry has a significant footprint in each of them. Second, these are the areas where the potential environmental impacts go beyond local stakeholders and communities, and can even be global, as in the case of carbon. Prioritizing efforts in these areas will therefore ultimately be a strategic differentiator as the industry's businesses continue to grow. Over the next forty years, the world's population is predicted to rise from 6,300 million to 9,500 million people. This will mean a huge demand for natural resources. Indeed, consumption rates are such that current demand for raw materials will probably soon exceed the planet's capacity. As awareness of the situation grows, the public is demanding goods and services that are ever more environmentally sustainable. This means that massive efforts are required to reduce the amount of materials we use, including freshwater, minerals and oil, biodiversity, and marine resources. It is clear that business as usual is no longer possible. Today, companies face not only the economic fallout of the financial crisis; they face the substantial challenge of transitioning to a low-carbon economy that is constrained by dwindling, easily accessible natural resources. Innovative business models offer pioneering companies an early start toward the future. They can signal to consumers how to make sustainable choices and provide rewards for both the consumer and the shareholder. Climate change and carbon remain major risk discontinuities that we need to better understand and deal with. In the absence of a global carbon solution, the principal objective of any individual country should be to reduce its global carbon emissions by encouraging conservation. The mineral industry's internal response is to continue to focus on reducing the energy intensity of existing operations through energy efficiency and the progressive introduction of new technology. Planning of new projects must ensure that their energy footprint is minimal from the start. These actions will increase the long-term resilience of the business to uncertain energy and carbon markets.
This focus, combined with a strong demand for skills in this strategic area, requires an appropriate change in the initial and continuing training of engineers and technicians and in their awareness of eco-design. It will also need the development of measurement tools for consistent comparisons between companies, and the integration of assessments of the carbon footprint of mining equipment and services into a comprehensive impact study on the sustainable development of the economy.

Abstract:

In tunnel construction, as in every engineering work, decisions are usually made with incomplete data. Nevertheless, consciously or not, the builder weighs the risks (even if this is done subjectively) so that a cost can be offered. The objective of this paper is to recall the existence of a methodology for treating uncertainties in the data, so that their effect on the output of the computational model used can be seen, and the failure probability or the safety margin of a structure can then be estimated. In this scheme it is possible to include subjective knowledge of the statistical properties of the random variables and, using a numerical model consistent with the degree of complexity appropriate to the problem at hand, to make rationally based decisions. As will be shown, with the method it is possible to quantify the relative importance of the random variables and, in addition, it can be used, under certain conditions, to solve the inverse problem. It is thus a method very well suited both to the design and to the control phases of tunnel construction.
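A minimal Monte Carlo sketch of this kind of uncertainty treatment, assuming a toy limit-state function g = resistance - load with invented lognormal inputs: propagate the input distributions through the model, estimate the failure probability, and rank the random variables by a crude importance measure.

```python
# Propagate subjective input distributions through a toy limit-state
# model to estimate failure probability and input importance. The
# distributions and the model are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
resistance = rng.lognormal(mean=np.log(120.0), sigma=0.15, size=n)  # e.g. lining capacity
load = rng.lognormal(mean=np.log(80.0), sigma=0.25, size=n)         # e.g. ground pressure

g = resistance - load                  # safety margin; failure when g < 0
pf = np.mean(g < 0.0)
print(f"estimated failure probability: {pf:.4f}")

# Crude importance measure: correlation of each input with the margin.
for name, x in [("resistance", resistance), ("load", load)]:
    print(name, "corr:", round(float(np.corrcoef(x, g)[0, 1]), 3))
```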

Abstract:

Nuclear power plants need highly specialized and trained personnel. The nuclear power plant specialized training sector therefore needs to incorporate the latest advances in training methods. A large array of face-to-face training courses exists, and it has become necessary to transform these courses in order to deliver them with the new information technologies available. This requires multidisciplinary teams, including engineers, which must identify the educational objectives, competences, contents, and quality control of each course. In this project, knowledge engineering techniques are employed as the methodological axis for transforming a face-to-face training course into on-line training through the use of new information technologies. Nowadays, new information and communication technologies are in constant evolution. They have immersed themselves in our world, transforming our vision of it and giving rise to new opportunities. For this reason, this project seeks to unite e-learning and the corporate world. The main objective is the design, on an e-learning platform, of a technical course that trains nuclear power plant control-room operators. The work carried out in this project has been, in addition to transforming a face-to-face course into an on-line one, to obtain a methodology with which other courses can be transformed in the future. To achieve this, attention must be paid to both the content and the management of the courses. The project therefore begins with basic definitions of e-learning terminology. It continues with the generation of a methodology that applies knowledge management to transform any face-to-face course to this platform. Once the methodology is defined, it is applied to the design of the specific course on Inherent Coefficients of Reactivity. The project ends with an economic study establishing its viability and with the creation of an economic model that estimates the price of any future course.

Abstract:

Air mass and atmospheric components (basically aerosols (AOD) and precipitable water (PW)) determine the absorption of the sunlight that arrives at the Earth's surface. Radiative models such as SMARTS or MODTRAN use these parameters to generate an equivalent spectrum. However, complex and expensive instruments (such as the AERONET network devices) are needed to obtain AOD and PW. On the other hand, the use of isotype cells is a convenient way to spectrally characterize a site for CPV, since they provide the photocurrents of the different internal subcells individually. By crossing data from an AERONET station and a Tri-band Spectroheliometer, a model that correlates Spectral Mismatch Ratios with atmospheric parameters is proposed. Given the number of stations in the AERONET network, this model may be used to estimate the spectral influence on the energy performance of CPV systems close to any of the stations worldwide.
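A minimal sketch of the proposed correlation, assuming a simple linear form and synthetic data: regress a spectral mismatch ratio (as would be derived from isotype-cell photocurrents) on AOD and PW by ordinary least squares. The coefficients are invented for illustration and are not the fitted model of the paper.

```python
# Toy linear correlation between a spectral mismatch ratio (SMR) and
# the atmospheric parameters AOD and PW. All values are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n = 300
aod = rng.uniform(0.02, 0.5, n)        # aerosol optical depth
pw = rng.uniform(0.5, 4.0, n)          # precipitable water (cm)
smr = 1.0 - 0.3 * aod + 0.05 * pw + rng.normal(0.0, 0.02, n)

# Ordinary least squares: SMR ~ b0 + b1*AOD + b2*PW.
A = np.column_stack([np.ones(n), aod, pw])
coef, *_ = np.linalg.lstsq(A, smr, rcond=None)
print("intercept, AOD coef, PW coef:", np.round(coef, 3))
```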

Abstract:

Recently, methods for computing D-optimal designs for population pharmacokinetic studies have become available. However, there are few publications that have prospectively evaluated the benefits of D-optimality in population or single-subject settings. This study compared a population optimal design with an empirical design for estimating the base pharmacokinetic model for enoxaparin in a stratified randomized setting. The population pharmacokinetic D-optimal design for enoxaparin was estimated using the PFIM function (MATLAB version 6.0.0.88). The optimal design was based on a one-compartment model with lognormal between-subject variability and proportional residual variability, and consisted of a single design with three sampling windows (0-30 min, 1.5-5 hr and 11-12 hr post-dose) for all patients. The empirical design consisted of three sample time windows per patient from a total of nine windows that collectively represented the entire dose interval. Each patient was assigned to have one blood sample taken from each of three different windows. Windows for blood sampling times were also provided for the optimal design. Ninety-six patients who were currently receiving enoxaparin therapy were recruited into the study. Patients were randomly assigned to either the optimal or the empirical sampling design, stratified by body mass index. The exact times of blood samples and doses were recorded. Analysis was undertaken using NONMEM (version 5). The empirical design supported a one-compartment linear model with additive residual error, while the optimal design supported a two-compartment linear model with additive residual error, as did the model derived from the full data set. A posterior predictive check was performed in which the models arising from the empirical and optimal designs were used to predict into the full data set. This revealed that the model derived from the optimal design was superior to the empirical design model in terms of precision and was similar to the model developed from the full data set. This study suggests that optimal design techniques may be useful, even when the optimized design was based on a model that was misspecified in terms of the structural and statistical models, and when the implementation of the optimally designed study deviated from the nominal design.
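To illustrate what D-optimality buys, here is a hedged sketch that compares two candidate sampling schedules by the determinant of the Fisher information matrix of a fixed-effects one-compartment model C(t) = (D/V)*exp(-k*t). Parameter values and schedules are invented; the study's design was computed with the PFIM function for a population model with between-subject variability.

```python
# D-optimality sketch: larger det(FIM) means more precise parameter
# estimates. Fixed-effects one-compartment model; invented values.
import numpy as np

D, k, V, sigma = 100.0, 0.35, 8.0, 0.1   # dose, elim. rate, volume, residual SD

def fim(times):
    t = np.asarray(times, dtype=float)
    c = (D / V) * np.exp(-k * t)
    # Sensitivities of C(t) with respect to (k, V).
    J = np.column_stack([-t * c, -c / V])
    return J.T @ J / sigma**2

early = [0.25, 0.5, 1.0]                 # all samples near the dose
spread = [0.25, 3.0, 11.0]               # samples spread over the interval
for name, sched in [("early", early), ("spread", spread)]:
    print(name, "det(FIM) =", f"{np.linalg.det(fim(sched)):.3e}")
```

Running this shows the spread schedule yields a substantially larger determinant, which is why optimal-design software places samples across the dose interval rather than clustering them.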

Abstract:

Risk-ranking protocols are used widely to classify the conservation status of the world's species. Here we report on the first empirical assessment of their reliability by using a retrospective study of 18 pairs of bird and mammal species (one species extinct and the other extant) with eight different assessors. The performance of individual assessors varied substantially, but performance was improved by incorporating uncertainty in parameter estimates and consensus among the assessors. When this was done, the ranks from the protocols were consistent with the extinction outcome in 70-80% of pairs and there were mismatches in only 10-20% of cases. This performance was similar to the subjective judgements of the assessors after they had estimated the range and population parameters required by the protocols, and better than any single parameter. When used to inform subjective judgement, the protocols therefore offer a means of reducing unpredictable biases that may be associated with expert input and have the advantage of making the logic behind assessments explicit. We conclude that the protocols are useful for forecasting extinctions, although they are prone to some errors that have implications for conservation. Some level of error is to be expected, however, given the influence of chance on extinction. The performance of risk assessment protocols may be improved by providing training in the application of the protocols, incorporating uncertainty in parameter estimates and using consensus among multiple assessors, including some who are experts in the application of the protocols. Continued testing and refinement of the protocols may help to provide better absolute estimates of risk, particularly by re-evaluating how the protocols accommodate missing data.