912 results for Competing risks, Estimation of predator mortality, Over dispersion, Stochastic modeling


Relevance: 100.00%

Abstract:

Several approaches for the non-invasive, MRI-based measurement of the aortic pressure waveform over the cardiac cycle have been proposed in recent years. These methods are normally based on time-resolved, two-dimensional phase-contrast sequences with uni-directionally encoded velocities (2D PC-MRI). In contrast, three-dimensional acquisitions with tri-directional velocity encoding (4D PC-MRI) have been shown to be a suitable data source for detailed investigations of blood flow and spatial blood pressure maps. To avoid additional MR acquisitions, it would be advantageous if the aortic pressure waveform could also be computed from this form of MRI. We therefore propose an approach for computing the aortic pressure waveform that can be performed entirely on 4D PC-MRI data. After application of a segmentation algorithm, the approach computes the aortic pressure waveform automatically, without any manual steps. We show that our method agrees well with catheter measurements in an experimental phantom setup and produces physiologically realistic results in three healthy volunteers.
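The abstract does not spell out the computation itself, so purely as a point of reference the sketch below shows the pressure-gradient step commonly used in PC-MRI relative pressure mapping (not necessarily the authors' exact pipeline): the Navier-Stokes momentum equation is evaluated point-wise from the measured velocity field, and the resulting gradient field would then be integrated spatially along the vessel and averaged over the aortic cross-section to obtain a waveform. The array layout and the density and viscosity values are assumptions.

```python
# Illustrative sketch, not the authors' method: pointwise pressure gradient from a
# time-resolved 3D velocity field via the Navier-Stokes momentum equation.
import numpy as np

RHO = 1060.0   # assumed blood density [kg/m^3]
MU = 4.0e-3    # assumed dynamic viscosity [Pa*s]

def pressure_gradient(v, dt, dx):
    """v: velocity field of shape (T, 3, X, Y, Z) in m/s (components ordered x, y, z);
    dt: frame spacing [s]; dx: isotropic voxel size [m].
    Returns grad(p) with the same shape, from rho*(dv/dt + (v.grad)v) = -grad(p) + mu*lap(v)."""
    dvdt = np.gradient(v, dt, axis=0)                        # temporal acceleration
    conv = np.zeros_like(v)
    lap = np.zeros_like(v)
    for c in range(3):                                       # loop over velocity components
        grads = np.gradient(v[:, c], dx, axis=(1, 2, 3))     # spatial first derivatives
        conv[:, c] = sum(v[:, k] * grads[k] for k in range(3))                    # (v . grad) v_c
        lap[:, c] = sum(np.gradient(grads[k], dx, axis=1 + k) for k in range(3))  # Laplacian of v_c
    return -RHO * (dvdt + conv) + MU * lap                   # pointwise pressure gradient
```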

Relevance: 100.00%

Abstract:

Folk wisdom and popular literature hold that, in the face of death, individuals tend to regret things in their lives that they have done or failed to do. Terror Management Theory (TMT), in contrast, allows for the prediction that individuals who are confronted with death try to minimize the experience of regret in order to retain a positive self-esteem. Three experiments put these competing perspectives to the test. Drawing on TMT, we hypothesized and found that participants primed with their own death regretted fewer things than control-group participants. This pattern of results cannot be attributed to differing types of regrets (Study 1). Furthermore, we provide evidence suggesting that the effect is not purely a product of cognitive mechanisms such as differing levels of construal (Study 2) or cognitive contrast or deficits (Study 3). Rather, the reported results are best explained in terms of a motivational coping mechanism: when death is salient, individuals strive to bolster and protect their self-esteem and accordingly try to minimize the experience of regret. The results add to our conceptual understanding of regret and TMT, and suggest that a multitude of lifestyle guidebooks need updating.

Relevance: 100.00%

Abstract:

AIMS In the dual antiplatelet therapy (DAPT) study, continued thienopyridine beyond 12 months after drug-eluting stent placement was associated with increased mortality compared with placebo. We sought to evaluate factors related to mortality in randomized patients receiving either drug-eluting or bare metal stents in the DAPT study. METHODS AND RESULTS Patients were enrolled after coronary stenting, given thienopyridine and aspirin for 12 months, randomly assigned to continued thienopyridine or placebo for an additional 18 months (while taking aspirin), and subsequently treated with aspirin alone for another 3 months. A blinded independent adjudication committee evaluated deaths. Among 11 648 randomized patients, all-cause mortality rates were 1.9 vs. 1.5% (continued thienopyridine vs. placebo, P = 0.07), cardiovascular mortality 1.0 vs. 1.0% (P = 0.97), and non-cardiovascular mortality 0.9 vs. 0.5% (P = 0.01) over the randomized period (Months 12-30). Rates of fatal bleeding were 0.2 vs. 0.1% (P = 0.81), and deaths related to any prior bleeding were 0.3 vs. 0.2% (P = 0.36; Months 12-33). Cancer incidence did not differ (2.0 vs. 1.6%, P = 0.12). Cancer-related deaths occurred in 0.6 vs. 0.3% (P = 0.02) and were rarely related to bleeding (0.1 vs. 0%, P = 0.25). After excluding deaths in patients with cancer diagnosed before enrolment, rates were 0.4 vs. 0.3% (P = 0.16). CONCLUSION Bleeding accounted for a minority of deaths among patients treated with continued thienopyridine. Cancer-related death in association with thienopyridine therapy was mainly not related to bleeding and may be a chance finding. Caution is warranted when considering extended thienopyridine therapy in patients with advanced cancer. TRIAL REGISTRATION clinicaltrials.gov identifier: NCT00977938.

Relevance: 100.00%

Abstract:

Data derived from 1,194 gravidas presenting at the observation unit of a city/county hospital between October 11 and December 7, 1979 were evaluated with respect to the proportion ingesting drugs during pregnancy. The mean age of the mothers at the time of the interview was 22.0 years; 43.0 percent were Black, 34.0 percent Latin-American, 21.0 percent White, and 2.0 percent other; mean gravidity was 2.5 pregnancies; mean parity was 1.0; and the mean number of previous abortions was 0.34. Completed interview data were available for 1,119 gravidas, with corresponding urinalyses for 997 subjects. Overall, 90.1 percent of the subjects reported ingestion of one or more drug preparations (prescription, over-the-counter, or substances used for recreational purposes) during pregnancy, with a range of 0 to 11 substances and a mean of 2.7. Dietary supplements (vitamins and minerals) were most frequently reported, followed by non-narcotic analgesics. Of the population, 76.1 percent reported consumption of prescription medications, 42.5 percent reported consumption of over-the-counter medications, 45.7 percent reported consumption of a substance for recreational purposes, and 4.3 percent reported illicit consumption of a substance. For selected substances, no measurable difference was found between information obtained by interview and information obtained by urinalysis assay.

Relevance: 100.00%

Abstract:

A number of medical and social developments have had an impact on neonatal mortality over the past ten to fifteen years in the United States. The purpose of this study was to examine one of these developments, Newborn Intensive Care Units (NICUs), and evaluate their impact on neonatal mortality in Houston, Texas. This study was unique in that it used as its database matched birth and infant death records from two periods: 1958-1960 (before NICUs) and 1974-1976 (after NICUs). The neonatal mortality of singleton live infants born to Houston resident mothers was compared for two groups: infants born in hospitals which developed NICUs and infants born in all other Houston hospitals. Neonatal mortality comparisons were made using the following birth-characteristic variables: birthweight, gestation, race, sex, maternal age, legitimacy, birth order, and prenatal care. The results showed that hospitals which developed NICUs had a higher percentage of their population with high-risk characteristics. In spite of this, they had lower neonatal mortality rates in two categories: (1) white infants of 3.5-5.5 pounds birthweight, and (2) low birthweight infants whose mothers received no prenatal care. Black infants of 3.5-5.5 pounds birthweight did equally well in either hospital group. While the differences between the two hospital groups for these categories were not statistically significant at the p < 0.05 level, data from the 1958-1960 period substantiate that a marked change occurred in the 3.5-5.5 pounds birthweight category for infants born in hospitals which developed NICUs. Early data were not available for prenatal care. These findings support the conclusion that, in Houston, NICUs had some impact on neonatal mortality among moderately underweight infants.

Relevance: 100.00%

Abstract:

Carbon-isotope-based estimates of CO2 levels have been generated from a record of the photosynthetic fractionation of 13C (εp) in a central equatorial Pacific sediment core that spans the last ~255 ka. The 13C content of phytoplanktonic biomass was determined by analysis of C37 alkadienones. These compounds are exclusive products of prymnesiophyte algae, which at present grow most abundantly at depths of 70-90 m in the central equatorial Pacific. A record of the isotopic composition of dissolved CO2 was constructed from isotopic analyses of the planktonic foraminifera Neogloboquadrina dutertrei, which calcifies at 70-90 m in the same region. Values of εp, derived by comparison of the organic and inorganic delta values, were transformed to yield concentrations of dissolved CO2 (ce) based on a new, site-specific calibration of the relationship between εp and ce. The calibration was based on a reassessment of existing εp versus ce data, which support a physiologically based model in which εp is inversely related to ce. Values of pCO2, the partial pressure of CO2 that would be in equilibrium with the estimated concentrations of dissolved CO2, were calculated using Henry's law and the temperature determined from the alkenone unsaturation index UK'37. Uncertainties in these values arise mainly from uncertainties about the appropriateness (particularly over time) of the site-specific relationship between εp and 1/ce. These are discussed in detail, and it is concluded that the observed record of εp most probably reflects significant variations in ΔpCO2, the ocean-atmosphere disequilibrium, which appears to have ranged from ~110 µatm during glacial intervals (ocean > atmosphere) to ~60 µatm during interglacials. Fluxes of CO2 to the atmosphere would thus have been significantly larger during glacial intervals. If this were characteristic of large areas of the equatorial Pacific, then greater glacial sinks for the equatorially evaded CO2 must have existed elsewhere. Statistical analysis of air-sea pCO2 differences and other parameters revealed significant (p < 0.01) inverse correlations of ΔpCO2 with sea surface temperature and with the mass accumulation rate of opal. The former suggests a response to the strength of upwelling; the latter may indicate either drawdown of CO2 by siliceous phytoplankton or variation of [CO2]/[Si(OH)4] ratios in upwelling waters.
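As a rough illustration of the estimation chain described above (εp from the paired organic and inorganic δ13C values, ce from the inverse εp-ce relation, and pCO2 via Henry's law), the sketch below uses placeholder values for the calibration parameters εf and b and for the CO2 solubility K0; in practice the site-specific calibration and a solubility function such as Weiss (1974), evaluated at the UK'37-derived temperature, would replace them.

```python
# Illustrative sketch of the epsilon_p -> ce -> pCO2 chain; the slope/intercept of
# the epsilon_p vs 1/ce relation and the CO2 solubility are placeholders, not the
# site-specific calibration used in the study.

def epsilon_p(delta_co2_aq, delta_org):
    """Photosynthetic fractionation (permil) from the delta13C of dissolved CO2
    and of phytoplankton organic carbon."""
    return ((delta_co2_aq + 1000.0) / (delta_org + 1000.0) - 1.0) * 1000.0

def dissolved_co2(eps_p, eps_f=25.0, b=120.0):
    """Invert eps_p = eps_f - b/ce  ->  ce = b / (eps_f - eps_p)  [umol/kg].
    eps_f and b are illustrative values only."""
    return b / (eps_f - eps_p)

def pco2_uatm(ce_umol_kg, k0_mol_kg_atm=0.030):
    """Henry's law: pCO2 = ce / K0. K0 should come from a solubility function
    (e.g., Weiss 1974) at the alkenone temperature; a fixed placeholder is used here."""
    return (ce_umol_kg * 1e-6) / k0_mol_kg_atm * 1e6   # atm -> uatm

eps = epsilon_p(delta_co2_aq=-7.5, delta_org=-22.0)     # example delta13C values
print(round(pco2_uatm(dissolved_co2(eps)), 1))          # plausible pCO2 of a few hundred uatm
```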

Relevance: 100.00%

Abstract:

Qualitative and quantitative food composition, as well as the intensity of feeding, of the alfonsino Beryx splendens was examined on banks near the Azores. Data are presented by size group, taking into account the type of feeding of males and females. Crustaceans and fishes constituted the feeding ration. A tendency toward an increase in the number of consumed fishes in the course of the ontogenetic development of the alfonsino was noted. The alfonsino was shown to occupy the trophic level of a third-order consumer, performing the function of a deep-water predator.

Relevance: 100.00%

Abstract:

The Subtropical Front (STF), marking the northern boundary of the Southern Ocean, has a steep gradient in sea surface temperature (SST) of approximately 4°C over 0.5° of latitude. Presently, in the region south of Tasmania, the STF lies nominally at 47°S in summer and 45°S in winter. We present here SST reconstructions from a latitudinal transect of cores across the South Tasman Rise, southeast of Australia, during the late Quaternary. The SST reconstructions are based on two paleotemperature proxies, alkenones and faunal assemblages, which are used to assess past changes in SST in spring and summer. The north-south alignment of the core locations allows reconstruction of the movement of the STF over the last 100 ka. Surface water temperatures during the last glaciation in this region were ~4°C colder than today. Additional temperature changes greater than 4°C in magnitude seen in individual cores can be attributed to changes in the water mass overlying the core site caused by movement of the front across that location. During the penultimate interglacial, SST was ~2°C warmer and the STF was largely positioned south of 47°S. Movement of the STF to the north occurred during cool climate periods such as marine isotope stages 3 and 4. In the last glaciation, the front was at its farthest-north position, becoming pinned against the Tasmanian landmass. It moved south by 4° of latitude to 47°S in summer during the deglaciation but remained north of 45°S in spring throughout the early deglaciation. After 11 ka B.P., an inferred invigoration of the East Australia Current appears to have pushed the STF seasonally south of the East Tasman Plateau, until after 6 ka B.P., when it achieved its present configuration.
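For reference, alkenone-based SST reconstructions of this kind typically rest on a linear UK'37-temperature calibration. The small sketch below uses the widely cited global core-top calibration UK'37 = 0.033·SST + 0.044 (Müller et al., 1998), which may differ from the calibration applied in this particular study.

```python
# Hedged example: converting the alkenone unsaturation index UK'37 to SST with
# the global core-top calibration UK'37 = 0.033*SST + 0.044 (Mueller et al., 1998);
# the calibration used in the study may differ.

def uk37_to_sst(uk37):
    """Return SST in degrees C for a given UK'37 value."""
    return (uk37 - 0.044) / 0.033

# A UK'37 shift of ~0.13 corresponds to roughly the ~4 C glacial cooling noted above:
print(round(uk37_to_sst(0.60) - uk37_to_sst(0.47), 1))   # ~3.9 C
```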

Relevance: 100.00%

Abstract:

The optimum quality that can be asymptotically achieved in the estimation of a probability p using inverse binomial sampling is addressed. A general definition of quality is used in terms of the risk associated with a loss function that satisfies certain assumptions. It is shown that the limit superior of the risk for p asymptotically small has a minimum over all (possibly randomized) estimators. This minimum is achieved by certain non-randomized estimators. The model includes commonly used quality criteria as particular cases. Applications to the non-asymptotic regime are discussed considering specific loss functions, for which minimax estimators are derived.
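As a concrete point of reference (not the minimax construction derived in the paper), the sketch below simulates inverse binomial sampling, applies the classical unbiased estimator p_hat = (r-1)/(N-1), and estimates its mean squared relative error by Monte Carlo, one common choice of loss, illustrating how the risk settles toward a limit as p becomes small.

```python
# Monte Carlo sketch of inverse binomial sampling: sample Bernoulli(p) trials until
# the r-th success, estimate p with (r-1)/(N-1), and evaluate a relative quadratic
# risk. Generic illustration only, not the paper's minimax estimators.
import numpy as np

def relative_risk(p, r, trials=100_000, seed=1):
    """Mean squared relative error of p_hat = (r - 1)/(N - 1), where N is the
    number of Bernoulli(p) trials needed to observe r successes."""
    rng = np.random.default_rng(seed)
    failures = rng.negative_binomial(r, p, size=trials)   # failures before the r-th success
    n = failures + r                                       # total number of trials N
    p_hat = (r - 1) / (n - 1)
    return float(np.mean(((p_hat - p) / p) ** 2))

for p in (0.1, 0.01, 0.001):
    print(p, round(relative_risk(p, r=10), 4))             # the risk levels off as p -> 0
```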

Relevance: 100.00%

Abstract:

A comprehensive assessment of nitrogen (N) flows at the landscape scale is fundamental to understanding spatial interactions in the N cascade and to informing the development of locally optimised N management strategies. To explore these interactions, complete N budgets were estimated for two contrasting hydrological catchments (dominated by agricultural grassland vs. semi-natural peat-dominated moorland), forming part of an intensively studied landscape in southern Scotland. Local-scale atmospheric dispersion modelling and detailed farm and field inventories provided high-resolution estimates of input fluxes. Direct agricultural inputs (i.e. grazing excreta, N2 fixation, organic and synthetic fertiliser) accounted for most of the catchment N inputs, representing 82% in the grassland and 62% in the moorland catchment, while atmospheric deposition made a significant contribution, particularly in the moorland catchment, where it contributed 38% of the N inputs. The estimated catchment N budgets highlighted areas of key uncertainty, particularly N2 exchange and stream N export. The resulting N balances suggest that the study catchments have a limited capacity to store N within soils, vegetation and groundwater. The "catchment N retention", i.e. the amount of N that is either stored within the catchment or lost through atmospheric emissions, was estimated to be 13% of the net anthropogenic input in the moorland catchment and 61% in the grassland catchment. These values contrast with regional-scale estimates: catchment retention of net anthropogenic inputs estimated within Europe at the regional scale ranges from 50% to 90%, with an average of 82% (Billen et al., 2011). This study emphasises the need for detailed budget analyses to identify the N status of European landscapes.
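As a minimal illustration of the retention bookkeeping used above, and assuming stream export is the only loss pathway not counted as retention, the sketch below computes the retained fraction from made-up fluxes chosen to reproduce the 61% grassland figure; the study's actual budget terms are more detailed.

```python
# Minimal sketch of the "catchment N retention" bookkeeping, with made-up fluxes
# in any consistent unit; the study's budget terms are more detailed.

def n_retention(net_anthropogenic_input, stream_export):
    """Fraction of the net anthropogenic N input that is retained, i.e. stored in
    soils/vegetation/groundwater or lost as gaseous emissions. Assuming a closed
    budget, this is everything that does not leave via the stream."""
    return (net_anthropogenic_input - stream_export) / net_anthropogenic_input

# Illustrative fluxes chosen to reproduce the 61% grassland figure quoted above:
print(round(n_retention(net_anthropogenic_input=50.0, stream_export=19.5), 2))   # 0.61
```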

Relevance: 100.00%

Abstract:

One of the most promising areas in which probabilistic graphical models have shown incipient activity is the field of heuristic optimization and, in particular, Estimation of Distribution Algorithms. Due to their inherent parallelism, different research lines have been studied to improve Estimation of Distribution Algorithms in terms of execution time and/or accuracy. Among these proposals, we focus on the so-called distributed or island-based models. This approach defines several islands (algorithm instances) running independently and exchanging information with a given frequency. The information sent by the islands can be either a set of individuals or a probabilistic model. This paper presents a comparative study of a distributed univariate Estimation of Distribution Algorithm and a multivariate version, paying special attention to the comparison of two alternative methods for exchanging information, over a wide set of parameters and problems, namely the standard benchmark developed for the IEEE Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems of the ISDA 2009 Conference. Several analyses from different points of view have been conducted to assess both the influence of the parameters and the relationships between them, including a characterization of the configurations according to their behavior on the proposed benchmark.
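As a simplified, hypothetical illustration of the island-based scheme described above (not one of the configurations benchmarked in the paper), the sketch below runs a continuous univariate EDA with Gaussian marginals on each island and periodically exchanges the probabilistic models themselves, which is one of the two information-exchange options mentioned; parameters, topology, and the sphere test function are illustrative.

```python
# Simplified sketch of an island-based univariate EDA (Gaussian marginals) on the
# sphere function; islands exchange their probabilistic models every few generations.
import numpy as np

def island_umda(n_islands=4, dim=10, pop=50, top=15, gens=200, migrate_every=10, seed=0):
    """Each island keeps a per-variable mean/std model, samples a population, selects
    the best individuals, re-estimates the model, and every `migrate_every` generations
    mixes its model with its ring neighbour's."""
    rng = np.random.default_rng(seed)
    sphere = lambda x: np.sum(x ** 2, axis=1)
    means = [rng.uniform(-5.0, 5.0, dim) for _ in range(n_islands)]
    stds = [np.full(dim, 3.0) for _ in range(n_islands)]

    for g in range(1, gens + 1):
        for i in range(n_islands):
            x = rng.normal(means[i], stds[i], size=(pop, dim))       # sample population
            best = x[np.argsort(sphere(x))[:top]]                    # truncation selection
            means[i], stds[i] = best.mean(axis=0), best.std(axis=0) + 1e-12   # re-estimate model
        if g % migrate_every == 0:                                   # ring migration of models
            means = [0.5 * (means[i] + means[(i + 1) % n_islands]) for i in range(n_islands)]
            stds = [0.5 * (stds[i] + stds[(i + 1) % n_islands]) for i in range(n_islands)]
    return min(float(sphere(m[None, :])[0]) for m in means)

print(island_umda())   # objective value at the best island model's mean (small on the sphere)
```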

Relevance: 100.00%

Abstract:

This thesis presents a comprehensive quality control procedure for photovoltaic plants, covering everything from the initial estimation of production expectations to the surveillance of the installation's performance once it is in operation. The procedure reduces the uncertainty associated with plant behaviour and increases long-term reliability, thereby optimizing performance. Photovoltaic technology has evolved enormously in recent years, and photovoltaic plants are now able to produce energy at prices that are fully competitive with other energy sources. This raises the requirements on the performance and reliability of these facilities. Meeting them calls for adapting the quality control procedures currently applied and for developing new methods that provide a more complete knowledge of the state of the plants and allow surveillance to be maintained over time. In addition, today's tight operating margins require methods for estimating energy production with the lowest possible uncertainty during the design phase. The quality control proposal presented in this work starts from earlier protocols oriented to the commissioning phase of a photovoltaic installation and complements them with methods applicable to the operation phase, paying particular attention to the main problems that appear in plants over their lifetime (hot spots, soiling impact, ageing, etc.). It also incorporates a protocol for monitoring and analysing the performance of the installations from their monitoring data, ranging from checking the validity of the recorded data itself to the detection and diagnosis of failures, which provides automated and detailed knowledge of the plants. This procedure is oriented to facilitating operation and maintenance tasks so as to guarantee a high operational availability of the installation. Returning to the initial phase of calculating production expectations, the data recorded at the plants are used to improve the methods for estimating the incident irradiation, which is the component that adds the most uncertainty to the modelling process. The development and application of this quality control procedure have been carried out in 39 large photovoltaic plants, totalling 250 MW, distributed across several countries in Europe and Latin America.
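The surveillance protocol itself is only summarized above; as one hypothetical example of the kind of automated check such a protocol could include, the sketch below computes a daily performance ratio from monitoring data and flags underperforming days. The column names, the hourly sampling assumption and the 0.75 alert threshold are illustrative assumptions, not values taken from the thesis.

```python
# Hedged sketch of a typical surveillance check on PV monitoring data: the daily
# performance ratio PR = (E_AC / P_nom) / (H_poa / G_stc). Column names, sampling
# interval and threshold are assumptions for illustration.
import pandas as pd

G_STC = 1000.0  # W/m^2, irradiance at standard test conditions

def daily_performance_ratio(df, p_nom_kw):
    """df: monitoring records with columns 'timestamp', 'p_ac_kw' (AC power, kW) and
    'g_poa_wm2' (plane-of-array irradiance, W/m^2), assumed to be hourly samples."""
    df = df.set_index(pd.to_datetime(df["timestamp"]))
    daily = df.resample("D").sum(numeric_only=True)        # hourly sums -> kWh and Wh/m^2
    yield_final = daily["p_ac_kw"] / p_nom_kw              # final yield [kWh/kWp]
    yield_reference = daily["g_poa_wm2"] / G_STC           # reference yield [equivalent sun hours]
    return (yield_final / yield_reference).rename("performance_ratio")

def flag_low_pr_days(pr, threshold=0.75):
    """Return the days whose performance ratio falls below the alert threshold."""
    return pr[pr < threshold]
```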

Relevance: 100.00%

Abstract:

Estimation of evolutionary distances has always been a major issue in the study of molecular evolution because evolutionary distances are required for estimating the rate of evolution in a gene, the divergence dates between genes or organisms, and the relationships among genes or organisms. Other closely related issues are the estimation of the pattern of nucleotide substitution, the estimation of the degree of rate variation among sites in a DNA sequence, and statistical testing of the molecular clock hypothesis. Mathematical treatments of these problems are considerably simplified by the assumption of a stationary process in which the nucleotide compositions of the sequences under study have remained approximately constant over time, and there now exist fairly extensive studies of stationary models of nucleotide substitution, although some problems remain to be solved. Nonstationary models are much more complex, but significant progress has been recently made by the development of the paralinear and LogDet distances. This paper reviews recent studies on the above issues and reports results on correcting the estimation bias of evolutionary distances, the estimation of the pattern of nucleotide substitution, and the estimation of rate variation among the sites in a sequence.
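As a concrete example of the nonstationarity-robust distances mentioned above, the following sketch computes the paralinear distance (Lake, 1994) for a pair of aligned sequences from their joint and marginal nucleotide frequencies; the toy sequences and the plain frequency estimates are for illustration only.

```python
# Sketch of the paralinear distance (Lake 1994), one of the distances designed for
# nonstationary nucleotide composition: d = -(1/4) * ln( det(J) / sqrt(det(Dx)*det(Dy)) ),
# where J is the joint nucleotide frequency matrix of the two aligned sequences and
# Dx, Dy are diagonal matrices of their marginal frequencies.
import numpy as np

NUC = {"A": 0, "C": 1, "G": 2, "T": 3}

def paralinear_distance(seq_x, seq_y):
    J = np.zeros((4, 4))
    for a, b in zip(seq_x, seq_y):             # joint counts over aligned sites
        J[NUC[a], NUC[b]] += 1.0
    J /= J.sum()                                # joint frequencies
    dx = J.sum(axis=1)                          # marginal frequencies of seq_x
    dy = J.sum(axis=0)                          # marginal frequencies of seq_y
    return -0.25 * np.log(np.linalg.det(J) / np.sqrt(np.prod(dx) * np.prod(dy)))

x = "ACGTACGTACGTACGTACGTGGCCAATT"              # toy aligned sequences
y = "ACGTACGAACGTTCGTACGTGGCCAGTT"
print(round(paralinear_distance(x, y), 3))
```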

Relevance: 100.00%

Abstract:

This paper deals with the estimation of a time-invariant channel spectrum from its own nonuniform samples, assuming there is a bound on the channel’s delay spread. Except for this last assumption, this is the basic estimation problem in systems providing channel spectral samples. However, as shown in the paper, the delay spread bound leads us to view the spectrum as a band-limited signal, rather than the Fourier transform of a tapped delay line (TDL). Using this alternative model, a linear estimator is presented that approximately minimizes the expected root-mean-square (RMS) error for a deterministic channel. Its main advantage over the TDL is that it takes into account the spectrum’s smoothness (time width), thus providing a performance improvement. The proposed estimator is compared numerically with the maximum likelihood (ML) estimator based on a TDL model in pilot-assisted channel estimation (PACE) for OFDM.
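For orientation, the sketch below contrasts the two ingredients discussed above in a generic numpy simulation: per-pilot least-squares estimates of an OFDM channel, and a TDL-based fit that exploits a bound L on the delay spread by modelling the frequency response as the DFT of at most L time-domain taps. It is not the band-limited estimator proposed in the paper, and all parameter values are illustrative.

```python
# Generic sketch (not the paper's estimator): LS pilot estimates of an OFDM channel
# followed by a least-squares fit of a TDL model with at most L taps.
import numpy as np

rng = np.random.default_rng(0)
N, L, n_pilots = 64, 8, 16                       # subcarriers, delay-spread bound (taps), pilots

h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2 * L)   # "true" channel taps
H = np.fft.fft(h, N)                                                   # true frequency response

pilots = np.arange(0, N, N // n_pilots)          # equispaced pilot subcarriers
x_p = np.ones(n_pilots)                          # known pilot symbols (all ones for simplicity)
noise = 0.05 * (rng.normal(size=n_pilots) + 1j * rng.normal(size=n_pilots))
y_p = H[pilots] * x_p + noise                    # received pilot observations

H_ls = y_p / x_p                                 # per-pilot least-squares estimates

F = np.exp(-2j * np.pi * np.outer(pilots, np.arange(L)) / N)   # partial DFT matrix (pilots x taps)
h_hat = np.linalg.lstsq(F, H_ls, rcond=None)[0]                # fit at most L time-domain taps
H_tdl = np.fft.fft(h_hat, N)                                   # interpolated full-band estimate

print("RMS error of TDL fit:", round(float(np.sqrt(np.mean(np.abs(H_tdl - H) ** 2))), 4))
```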

Relevance: 100.00%

Abstract:

In this study, a methodology based on a dynamical framework is proposed to incorporate additional sources of information into normalized difference vegetation index (NDVI) time series of agricultural observations for a phenological state estimation application. The proposed implementation is based on the particle filter (PF) scheme, which is able to integrate multiple sources of data. Moreover, the dynamics-led design is able to conduct real-time (online) estimation, i.e., without having to wait until the end of the campaign. The algorithm is evaluated by estimating the phenological states over a set of rice fields in Seville (SW Spain). A Landsat-5/7 NDVI image series is complemented with two distinct sources of information: SAR images from the TerraSAR-X satellite and air temperature information from a ground-based station. An improvement in overall estimation accuracy is obtained, especially when the NDVI time series is incomplete. The sensitivity to different development intervals and the mitigation of discontinuities in the time series are also evaluated in this work, demonstrating the benefits of this data fusion approach based on dynamic systems.
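As a minimal, self-contained illustration of the filtering scheme (not the study's actual models), the sketch below runs a sampling-importance-resampling particle filter in which a scalar phenological stage advances with a noisy growth rate and is corrected online by NDVI observations through an assumed linear-with-plateau observation model; missing acquisition dates are simply skipped in the update step.

```python
# Minimal SIR particle filter sketch: predict stage growth, weight by an NDVI
# likelihood, resample. Observation model, noise levels and stage scale are toy
# assumptions, not the ones used in the study.
import numpy as np

def ndvi_model(stage):
    """Assumed observation model: NDVI increases with stage up to a plateau."""
    return 0.15 + 0.07 * np.minimum(stage, 8.0)

def particle_filter(ndvi_obs, n_particles=1000, growth=1.0, q=0.15, r=0.04, seed=0):
    rng = np.random.default_rng(seed)
    stages = np.zeros(n_particles)                                     # all particles start at stage 0
    estimates = []
    for z in ndvi_obs:
        stages = stages + growth + q * rng.normal(size=n_particles)    # prediction step
        if not np.isnan(z):                                            # skip missing acquisitions
            w = np.exp(-0.5 * ((z - ndvi_model(stages)) / r) ** 2)     # Gaussian NDVI likelihood
            w /= w.sum()
            stages = stages[rng.choice(n_particles, size=n_particles, p=w)]   # resampling
        estimates.append(float(stages.mean()))                         # online state estimate
    return estimates

obs = [0.22, 0.30, np.nan, 0.44, 0.49, 0.58, np.nan, 0.70]             # toy NDVI series with gaps
print([round(s, 2) for s in particle_filter(obs)])
```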