960 results for Competing risks, Estimation of predator mortality, Over dispersion, Stochastic modeling
Abstract:
A number of medical and social developments have had an impact on neonatal mortality over the past 10 to 15 years in the United States. The purpose of this study was to examine one of these developments, Newborn Intensive Care Units (NICUs), and evaluate their impact on neonatal mortality in Houston, Texas. This study was unique in that it used as its database matched birth and infant death records from two periods: 1958-1960 (before NICUs) and 1974-1976 (after NICUs). The neonatal mortality of single, live infants born to Houston resident mothers was compared for two groups: infants born in hospitals which developed NICUs and infants born in all other Houston hospitals. Neonatal mortality comparisons were made using the following birth-characteristic variables: birthweight, gestation, race, sex, maternal age, legitimacy, birth order and prenatal care. The results of the study showed that hospitals which developed NICUs had a higher percentage of their population with high-risk characteristics. In spite of this, they had lower neonatal mortality rates in two categories: (1) white infants of 3.5-5.5 pounds birthweight, and (2) low-birthweight infants whose mothers received no prenatal care. Black infants of 3.5-5.5 pounds birthweight did equally well in either hospital group. While the differences between the two hospital groups for these categories were not statistically significant at the p < 0.05 level, data from the 1958-1960 period substantiate that a marked change occurred in the 3.5-5.5 pounds birthweight category for infants born in hospitals which developed NICUs. Early data were not available for prenatal care. These findings support the conclusion that, in Houston, NICUs had some impact on neonatal mortality among moderately underweight infants.
Abstract:
Carbon isotopically based estimates of CO2 levels have been generated from a record of the photosynthetic fractionation of 13C (epsilon p) in a central equatorial Pacific sediment core that spans the last ~255 ka. Contents of 13C in phytoplanktonic biomass were determined by analysis of C37 alkadienones. These compounds are exclusive products of Prymnesiophyte algae which at present grow most abundantly at depths of 70-90 m in the central equatorial Pacific. A record of the isotopic composition of dissolved CO2 was constructed from isotopic analyses of the planktonic foraminifera Neogloboquadrina dutertrei, which calcifies at 70-90 m in the same region. Values of epsilon p, derived by comparison of the organic and inorganic delta values, were transformed to yield concentrations of dissolved CO2 (c_e) based on a new, site-specific calibration of the relationship between epsilon p and c_e. The calibration was based on reassessment of existing epsilon p versus c_e data, which support a physiologically based model in which epsilon p is inversely related to c_e. Values of PCO2, the partial pressure of CO2 that would be in equilibrium with the estimated concentrations of dissolved CO2, were calculated using Henry's law and the temperature determined from the alkenone-unsaturation index UK37. Uncertainties in these values arise mainly from uncertainties about the appropriateness (particularly over time) of the site-specific relationship between epsilon p and 1/c_e. These are discussed in detail and it is concluded that the observed record of epsilon p most probably reflects significant variations in Delta pCO2, the ocean-atmosphere disequilibrium, which appears to have ranged from ~110 µatm during glacial intervals (ocean > atmosphere) to ~60 µatm during interglacials. Fluxes of CO2 to the atmosphere would thus have been significantly larger during glacial intervals.
If this were characteristic of large areas of the equatorial Pacific, then greater glacial sinks for the equatorially evaded CO2 must have existed elsewhere. Statistical analysis of air-sea pCO2 differences and other parameters revealed significant (p < 0.01) inverse correlations of Delta pCO2 with sea surface temperature and with the mass accumulation rate of opal. The former suggests a response to the strength of upwelling; the latter may indicate either drawdown of CO2 by siliceous phytoplankton or variation of [CO2]/[Si(OH)4] ratios in upwelling waters.
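The two conversion steps named in this abstract, alkenone temperature and Henry's law, can be sketched numerically. This is a minimal sketch, assuming the widely used linear calibration of the related UK'37 index (Müller et al., 1998) and an illustrative round-number CO2 solubility K0 rather than the temperature- and salinity-dependent value the study would actually use.

```python
def sst_from_uk37(uk37):
    """Invert the linear alkenone calibration UK'37 = 0.033*SST + 0.044."""
    return (uk37 - 0.044) / 0.033

def pco2_from_dissolved(c_umol_per_kg, k0_mol_per_kg_atm=0.035):
    """Henry's law: pCO2 = [CO2] / K0, returned in microatmospheres."""
    return c_umol_per_kg * 1e-6 / k0_mol_per_kg_atm * 1e6

print(round(sst_from_uk37(0.50), 1))     # ~13.8 degrees C
print(round(pco2_from_dissolved(14.0)))  # 400 uatm at this illustrative K0
```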
Abstract:
Qualitative and quantitative food composition, as well as feeding intensity, of the beryx-alfonsino Beryx splendens was examined on banks near the Azores. Data are presented by size group, taking into account the type of feeding of males and females. Crustaceans and fishes were the constituents of its feeding ration. A tendency toward an increase in the number of consumed fishes in the course of the ontogenetic development of the beryx-alfonsino was noted. The beryx-alfonsino was shown to occupy the trophic level of third-order consumers, performing the function of a deep-water predator.
Abstract:
The Subtropical Front (STF) marking the northern boundary of the Southern Ocean has a steep gradient in sea surface temperature (SST) of approximately 4°C over 0.5° of latitude. Presently, in the region south of Tasmania, the STF lies nominally at 47°S in summer and 45°S in winter. We present here SST reconstructions in a latitudinal transect of cores across the South Tasman Rise, southeast of Australia, during the late Quaternary. SST reconstructions are based on two paleotemperature proxies, alkenones and faunal assemblages, which are used to assess past changes in SST in spring and summer. The north-south alignment of the core locations allows reconstruction of the movement of the STF over the last 100 ka. Surface water temperatures during the last glaciation in this region were ~4°C colder than today. Additional temperature changes greater in magnitude than 4°C seen in individual cores can be attributed to changes in the water mass overlying the core site caused by movement of the front across that location. During the penultimate interglacial, SST was ~2°C warmer and the STF was largely positioned south of 47°S. Movement of the STF to the north occurred during cool climate periods such as marine isotope stages 3 and 4. In the last glaciation, the front was at its farthest north position, becoming pinned against the Tasmanian landmass. It moved south by 4° of latitude to 47°S in summer during the deglaciation but remained north of 45°S in spring throughout the early deglaciation. After 11 ka B.P., inferred invigoration of the East Australian Current appears to have pushed the STF seasonally south of the East Tasman Plateau, until after 6 ka B.P., when it achieved its present configuration.
Abstract:
The optimum quality that can be asymptotically achieved in the estimation of a probability p using inverse binomial sampling is addressed. A general definition of quality is used in terms of the risk associated with a loss function that satisfies certain assumptions. It is shown that the limit superior of the risk for p asymptotically small has a minimum over all (possibly randomized) estimators. This minimum is achieved by certain non-randomized estimators. The model includes commonly used quality criteria as particular cases. Applications to the non-asymptotic regime are discussed considering specific loss functions, for which minimax estimators are derived.
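A minimal simulation of the sampling scheme discussed above, assuming the classic setup in which Bernoulli(p) trials are drawn until a fixed number r of successes occurs. The (r-1)/(N-1) estimator shown (due to Haldane) is one well-known non-randomized unbiased choice, not necessarily the minimax estimator derived in the paper.

```python
import random

# Inverse binomial sampling: Bernoulli(p) trials are drawn until r
# successes occur; N, the number of trials needed, is the observation.
# The Haldane estimator (r-1)/(N-1) is unbiased, unlike the MLE r/N,
# which is biased upward.

def inverse_binomial_trials(p, r, rng):
    n = successes = 0
    while successes < r:
        n += 1
        successes += rng.random() < p
    return n

def haldane_estimate(r, n):
    return (r - 1) / (n - 1)

rng = random.Random(0)
p_true, r = 0.02, 10
estimates = [haldane_estimate(r, inverse_binomial_trials(p_true, r, rng))
             for _ in range(2000)]
print(round(sum(estimates) / len(estimates), 3))  # close to p_true = 0.02
```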
Abstract:
A comprehensive assessment of nitrogen (N) flows at the landscape scale is fundamental to understand spatial interactions in the N cascade and to inform the development of locally optimised N management strategies. To explore these interactions, complete N budgets were estimated for two contrasting hydrological catchments (dominated by agricultural grassland vs. semi-natural peat-dominated moorland), forming part of an intensively studied landscape in southern Scotland. Local scale atmospheric dispersion modelling and detailed farm and field inventories provided high resolution estimations of input fluxes. Direct agricultural inputs (i.e. grazing excreta, N2 fixation, organic and synthetic fertiliser) accounted for most of the catchment N inputs, representing 82% in the grassland and 62% in the moorland catchment, while atmospheric deposition made a significant contribution, particularly in the moorland catchment, contributing 38% of the N inputs. The estimated catchment N budgets highlighted areas of key uncertainty, particularly N2 exchange and stream N export. The resulting N balances suggest that the study catchments have a limited capacity to store N within soils, vegetation and groundwater. The "catchment N retention", i.e. the amount of N which is either stored within the catchment or lost through atmospheric emissions, was estimated to be 13% of the net anthropogenic input in the moorland and 61% in the grassland catchment. These values contrast with regional scale estimates: Catchment retentions of net anthropogenic input estimated within Europe at the regional scale range from 50% to 90%, with an average of 82% (Billen et al., 2011). This study emphasises the need for detailed budget analyses to identify the N status of European landscapes.
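The "catchment N retention" bookkeeping described above reduces to simple budget arithmetic. The sketch below uses made-up numbers chosen to reproduce the reported percentages, not the study's actual budget terms.

```python
# Retention = (N stored within the catchment or lost through atmospheric
# emissions) / (net anthropogenic input), i.e. whatever does not leave
# via the stream. Figures are illustrative.

def catchment_retention(net_anthropogenic_input, stream_export):
    retained = net_anthropogenic_input - stream_export
    return retained / net_anthropogenic_input

# e.g. 100 kg N/ha/yr net input, 39 kg exported by the stream -> 61% retained
print(round(catchment_retention(100.0, 39.0), 2))  # 0.61 (grassland-like)
print(round(catchment_retention(100.0, 87.0), 2))  # 0.13 (moorland-like)
```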
Abstract:
One of the most promising areas in which probabilistic graphical models have shown an incipient activity is the field of heuristic optimization and, in particular, Estimation of Distribution Algorithms. Due to their inherent parallelism, different research lines have been studied trying to improve Estimation of Distribution Algorithms from the point of view of execution time and/or accuracy. Among these proposals, we focus on the so-called distributed or island-based models. This approach defines several islands (algorithm instances) running independently and exchanging information with a given frequency. The information sent by the islands can be either a set of individuals or a probabilistic model. This paper presents a comparative study of a distributed univariate Estimation of Distribution Algorithm and a multivariate version, paying special attention to the comparison of two alternative methods for exchanging information, over a wide set of parameters and problems: the standard benchmark developed for the IEEE Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems of the ISDA 2009 Conference. Several analyses from different points of view have been conducted to assess both the influence of the parameters and the relationships between them, including a characterization of the configurations according to their behavior on the proposed benchmark.
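As a concrete toy version of the island model, the sketch below runs a univariate EDA (UMDA-style) per island on a binary OneMax problem and, at a fixed frequency, exchanges information by averaging the islands' probability vectors. The population size, migration period and clamping bounds are invented for the example, not the paper's configuration.

```python
import random

def umda_islands(n_bits=30, n_islands=2, pop=60, sel=30,
                 generations=40, migrate_every=5, seed=0):
    rng = random.Random(seed)
    models = [[0.5] * n_bits for _ in range(n_islands)]
    for gen in range(1, generations + 1):
        for m in models:
            # sample a population from this island's univariate model
            popn = [[rng.random() < m[i] for i in range(n_bits)]
                    for _ in range(pop)]
            popn.sort(key=sum, reverse=True)        # OneMax fitness
            for i in range(n_bits):                 # re-estimate marginals
                freq = sum(ind[i] for ind in popn[:sel]) / sel
                m[i] = min(0.95, max(0.05, freq))   # clamp to keep diversity
        if gen % migrate_every == 0:
            # migration: islands exchange their probabilistic models
            avg = [sum(m[i] for m in models) / n_islands
                   for i in range(n_bits)]
            models = [avg[:] for _ in range(n_islands)]
    return models

models = umda_islands()
print(round(sum(models[0]) / len(models[0]), 2))    # marginals pushed toward 1
```

Sending the model (a short probability vector) instead of a set of individuals keeps the per-migration communication cost independent of the population size, which is one of the trade-offs the paper compares.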
Abstract:
This thesis presents a comprehensive quality control procedure for photovoltaic plants, covering everything from the initial phase of estimating energy production to the surveillance of the installation's performance once it is in operation. The procedure reduces the uncertainty associated with a plant's behaviour and increases its long-term reliability, thereby optimising its performance. The situation of photovoltaic technology has evolved dramatically in recent years, making photovoltaic plants capable of producing energy at prices fully competitive with other energy sources. This raises the requirements on the performance and reliability of these facilities. Meeting this demand requires adapting the quality control procedures applied, as well as developing new methods that provide a more complete knowledge of the state of health of the plants and allow surveillance to be maintained over time. In addition, the current tight operating margins require methods that estimate energy production with the lowest possible uncertainty during the design phase.
The quality control procedure presented in this work starts from previous protocols oriented to the commissioning phase of a photovoltaic system and complements them with methods for the operation phase, paying particular attention to the main problems that arise in photovoltaic plants during their lifetime (hot spots, the impact of soiling, ageing...). It also incorporates a protocol for surveillance and analysis of an installation's performance based on its monitoring data, covering everything from checking the validity of the recorded data itself to the detection and diagnosis of failures, and providing an automated, detailed knowledge of the plants. The procedure is oriented to facilitating operation and maintenance tasks so as to ensure a high operating availability of the installation. Returning to the initial stage of calculating production expectations, the data recorded at the plants are used to improve the methods for estimating the incident irradiation, which is the component that adds the most uncertainty to the modelling process. The development and application of this quality control procedure have been carried out in 39 large photovoltaic plants, totalling 250 MW, distributed across several countries in Europe and Latin America.
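One monitoring-data check of the kind the protocol describes can be as simple as a daily performance-ratio screen. The field layout, the 100 kW nominal power and the 0.7 threshold below are assumptions for the example, not values from the thesis.

```python
# Performance ratio: measured yield divided by the yield expected at the
# measured irradiation. Days falling below a threshold are flagged for
# inspection. All numbers are illustrative.

def performance_ratio(energy_kwh, p_nominal_kw, irradiation_kwh_m2,
                      g_stc_kw_m2=1.0):
    expected = p_nominal_kw * irradiation_kwh_m2 / g_stc_kw_m2
    return energy_kwh / expected

def flag_low_pr(daily_records, threshold=0.7):
    return [day for day, (e, h) in daily_records.items()
            if performance_ratio(e, 100.0, h) < threshold]

records = {"2015-06-01": (480.0, 6.0),   # PR = 0.80, healthy
           "2015-06-02": (300.0, 6.2)}   # PR ~ 0.48, flagged
print(flag_low_pr(records))  # ['2015-06-02']
```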
Abstract:
Estimation of evolutionary distances has always been a major issue in the study of molecular evolution because evolutionary distances are required for estimating the rate of evolution in a gene, the divergence dates between genes or organisms, and the relationships among genes or organisms. Other closely related issues are the estimation of the pattern of nucleotide substitution, the estimation of the degree of rate variation among sites in a DNA sequence, and statistical testing of the molecular clock hypothesis. Mathematical treatments of these problems are considerably simplified by the assumption of a stationary process in which the nucleotide compositions of the sequences under study have remained approximately constant over time, and there now exist fairly extensive studies of stationary models of nucleotide substitution, although some problems remain to be solved. Nonstationary models are much more complex, but significant progress has been recently made by the development of the paralinear and LogDet distances. This paper reviews recent studies on the above issues and reports results on correcting the estimation bias of evolutionary distances, the estimation of the pattern of nucleotide substitution, and the estimation of rate variation among the sites in a sequence.
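A sketch of one common form of the LogDet/paralinear distance mentioned above, d = -(1/4)[ln det(F) - (1/2) sum_i ln(fx_i * fy_i)], where F is the 4x4 divergence (joint frequency) matrix and fx, fy are the marginal base frequencies. The bias corrections the paper reviews are omitted here.

```python
import math
from itertools import permutations

BASES = "ACGT"

def det4(m):
    """Determinant of a 4x4 matrix via the Leibniz expansion."""
    total = 0.0
    for perm in permutations(range(4)):
        inv = sum(1 for i in range(4) for j in range(i + 1, 4)
                  if perm[i] > perm[j])
        prod = 1.0
        for i in range(4):
            prod *= m[i][perm[i]]
        total += (-1) ** inv * prod
    return total

def logdet_distance(seq_x, seq_y):
    n = len(seq_x)
    f = [[0.0] * 4 for _ in range(4)]          # divergence (joint) matrix
    for a, b in zip(seq_x, seq_y):
        f[BASES.index(a)][BASES.index(b)] += 1.0 / n
    fx = [sum(row) for row in f]               # marginal frequencies in x
    fy = [sum(f[i][j] for i in range(4)) for j in range(4)]
    return -0.25 * (math.log(det4(f))
                    - 0.5 * sum(math.log(fx[i] * fy[i]) for i in range(4)))

s = "ACGT" * 25
sy = "GCGT" + "ACGT" * 24                      # one substitution per 100 sites
print(round(logdet_distance(s, s), 6))         # ~0 for identical sequences
print(round(logdet_distance(s, sy), 3))        # ~0.01
```

Because the determinant factors through any matrix of substitution probabilities, this distance stays additive even when base composition drifts over time, which is why it suits the nonstationary models the review discusses.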
Abstract:
This paper deals with the estimation of a time-invariant channel spectrum from its own nonuniform samples, assuming there is a bound on the channel’s delay spread. Except for this last assumption, this is the basic estimation problem in systems providing channel spectral samples. However, as shown in the paper, the delay spread bound leads us to view the spectrum as a band-limited signal, rather than the Fourier transform of a tapped delay line (TDL). Using this alternative model, a linear estimator is presented that approximately minimizes the expected root-mean-square (RMS) error for a deterministic channel. Its main advantage over the TDL is that it takes into account the spectrum’s smoothness (time width), thus providing a performance improvement. The proposed estimator is compared numerically with the maximum likelihood (ML) estimator based on a TDL model in pilot-assisted channel estimation (PACE) for OFDM.
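The smoothness-exploiting idea can be illustrated with a real-valued analogue: least-squares fitting of a low-order Fourier (band-limited) model to noisy nonuniform samples, with the model order playing the role of the delay-spread bound. The signal, order and noise level below are invented for the sketch; the paper's estimator is a different, RMS-optimized linear estimator.

```python
import math
import random

def solve(a, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k]
                              for k in range(r + 1, n))) / m[r][r]
    return x

def fit_bandlimited(freqs, values, order):
    """Least-squares fit of a low-order Fourier model to nonuniform samples."""
    def basis(f):
        row = [1.0]
        for k in range(1, order + 1):
            row += [math.cos(2 * math.pi * k * f),
                    math.sin(2 * math.pi * k * f)]
        return row
    x = [basis(f) for f in freqs]
    n = len(x[0])
    ata = [[sum(r[i] * r[j] for r in x) for j in range(n)] for i in range(n)]
    atb = [sum(r[i] * v for r, v in zip(x, values)) for i in range(n)]
    coef = solve(ata, atb)
    return lambda f: sum(c * v for c, v in zip(coef, basis(f)))

rng = random.Random(1)
truth = lambda f: 1.0 + 0.5 * math.cos(2 * math.pi * f)  # smooth "spectrum"
freqs = sorted(rng.random() for _ in range(40))          # nonuniform points
vals = [truth(f) + 0.01 * rng.gauss(0, 1) for f in freqs]
est = fit_bandlimited(freqs, vals, order=2)
print(round(abs(est(0.3) - truth(0.3)), 3))              # small residual
```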
Abstract:
In this study, a methodology based on a dynamical framework is proposed to incorporate additional sources of information into normalized difference vegetation index (NDVI) time series of agricultural observations for a phenological state estimation application. The proposed implementation is based on the particle filter (PF) scheme, which is able to integrate multiple sources of data. Moreover, the dynamics-led design is able to conduct real-time (online) estimations, i.e., without having to wait until the end of the campaign. The evaluation of the algorithm is performed by estimating the phenological states over a set of rice fields in Seville (SW Spain). A Landsat-5/7 NDVI series of images is complemented with two distinct sources of information: SAR images from the TerraSAR-X satellite and air temperature information from a ground-based station. An improvement in the overall estimation accuracy is obtained, especially when the time series of NDVI data is incomplete. Evaluations of the sensitivity to different development intervals and of the mitigation of discontinuities in the time series are also addressed in this work, demonstrating the benefits of this data fusion approach based on dynamic systems.
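A bootstrap particle filter of the general kind described can be sketched on a scalar toy state observed through an NDVI-like measurement. The dynamics, noise levels and observation model below are invented for illustration, not the study's phenological model.

```python
import math
import random

def particle_filter(observations, n_particles=500, seed=0):
    rng = random.Random(seed)
    # initial particles: state near 0 before the first growth step
    particles = [rng.uniform(-0.1, 0.1) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # predict: linear growth plus process noise (assumed dynamics)
        particles = [p + 0.1 + rng.gauss(0, 0.02) for p in particles]
        # weight by a Gaussian likelihood of the NDVI-like observation
        w = [math.exp(-((z - p) ** 2) / (2 * 0.05 ** 2)) for p in particles]
        total = sum(w)
        w = [x / total for x in w]
        # stratified resampling
        cum = [0.0]
        for x in w:
            cum.append(cum[-1] + x)
        j, resampled = 0, []
        for i in range(n_particles):
            pos = (i + rng.random()) / n_particles
            while j < n_particles - 1 and cum[j + 1] < pos:
                j += 1
            resampled.append(particles[j])
        particles = resampled
        estimates.append(sum(particles) / n_particles)
    return estimates

obs = [0.1, 0.2, 0.3, 0.4, 0.5]            # synthetic NDVI-like series
est = particle_filter(obs)
print([round(e, 2) for e in est])          # tracks the observations
```

Because each observation is folded in through the weighting step, extra data sources (SAR, temperature) can be integrated simply by multiplying additional likelihood terms into the weights, which is the property the study exploits.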
Abstract:
Background: Sentinel node biopsy (SNB) is being increasingly used but its place outside randomized trials has not yet been established. Methods: The first 114 sentinel node (SN) biopsies performed for breast cancer at the Princess Alexandra Hospital from March 1999 to June 2001 are presented. In 111 cases axillary dissection was also performed, allowing the accuracy of the technique to be assessed. A standard combination of preoperative lymphoscintigraphy, intraoperative gamma probe and injection of blue dye was used in most cases. Results are discussed in relation to the risk and potential consequences of understaging. Results: Where both probe and dye were used, the SN was identified in 90% of patients. A significant number of patients were treated in two stages and the technique was no less effective in patients who had SNB performed at a second operation after the primary tumour had already been removed. The interval from radioisotope injection to operation was very wide (between 2 and 22 h) and did not affect the outcome. Nodal metastases were present in 42 patients in whom an SN was found, and in 40 of these the SN was positive, giving a false negative rate of 4.8% (2/42), with the overall percentage of patients understaged being 2%. For this particular group as a whole, the increased risk of death due to systemic therapy being withheld as a consequence of understaging (if SNB alone had been employed) is estimated at less than 1/500. The risk for individuals will vary depending on other features of the particular primary tumour. Conclusion: For patients who elect to have the axilla staged using SNB alone, the risk and consequences of understaging need to be discussed. These risks can be estimated by allowing for the specific surgeon's false negative rate for the technique, and considering the likelihood of nodal metastases for a given tumour. 
There appears to be no disadvantage with performing SNB at a second operation after the primary tumour has already been removed. Clearly, for a large number of patients, SNB alone will be safe, but ideally participation in randomized trials should continue to be encouraged.
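The understaging arithmetic reported above, made explicit with the paper's own counts (42 node-positive patients, 2 missed by the sentinel node, 111 patients with confirmatory axillary dissection):

```python
node_positive = 42   # patients with nodal metastases and an SN found
sn_positive = 40     # of those, patients whose sentinel node was positive
assessable = 111     # patients who also had axillary dissection
missed = node_positive - sn_positive

false_negative_rate = missed / node_positive        # 2/42
print(round(false_negative_rate * 100, 1))          # 4.8 (%)

understaged_fraction = missed / assessable          # of all assessable patients
print(round(understaged_fraction * 100))            # 2 (%)
```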
Abstract:
Water-sampler equilibrium partitioning coefficients and aqueous boundary layer mass transfer coefficients for atrazine, diuron, hexazinone and fluometuron onto C18 and SDB-RPS Empore disk-based aquatic passive samplers have been determined experimentally under a laminar flow regime (Re = 5400). The method involved accelerating the time to equilibrium of the samplers by exposing them to three water concentrations, decreasing stepwise to 50% and then 25% of the original concentration. Assuming first-order Fickian kinetics across a rate-limiting aqueous boundary layer, both parameters were determined computationally by unconstrained nonlinear optimization. In addition, a method of estimating mass transfer coefficients (and therefore sampling rates) using the dimensionless Sherwood correlation developed for laminar flow over a flat plate is applied. For each of the herbicides, this correlation is validated to within 40% of the experimental data. The study demonstrates that for trace concentrations (below 0.1 µg/L) and these flow conditions, a naked Empore disk performs well as an integrative sampler over short deployments (up to 7 days) for the range of polar herbicides investigated. The SDB-RPS disk allows a longer integrative period than the C18 disk due to its higher sorbent mass and/or its more polar sorbent chemistry. This work also suggests that for certain passive sampler designs, empirical estimation of sampling rates may be possible using correlations that have been available in the chemical engineering literature for some time.
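The two relationships used in the study, first-order uptake toward equilibrium and the laminar flat-plate Sherwood correlation Sh = 0.664 Re^(1/2) Sc^(1/3), can be sketched as follows. The partition coefficient, rate constant and Schmidt number are illustrative values, not the fitted parameters from the paper.

```python
import math

def sampler_concentration(c_water, k_partition, k_rate, t_days):
    """First-order Fickian uptake: Cs(t) = K * Cw * (1 - exp(-k*t))."""
    return k_partition * c_water * (1.0 - math.exp(-k_rate * t_days))

def sherwood_flat_plate(re, sc):
    """Laminar flat-plate correlation: Sh = 0.664 * Re^0.5 * Sc^(1/3)."""
    return 0.664 * math.sqrt(re) * sc ** (1.0 / 3.0)

# During early deployment the sampler is integrative (uptake ~linear);
# at long times it approaches equilibrium at K * Cw.
early = sampler_concentration(0.1, 5000.0, 0.05, 2.0)
late = sampler_concentration(0.1, 5000.0, 0.05, 60.0)
print(round(early, 1), round(late, 1))          # 47.6, then near K*Cw = 500

print(round(sherwood_flat_plate(5400, 1000)))   # Re from the study, Sc assumed
```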
Abstract:
The water retention curve (WRC) is a hydraulic characteristic of concrete required for advanced modeling of water (and thus solute) transport in variably saturated, heterogeneous concrete. Unfortunately, determination by a direct experimental method (for example, measuring equilibrium moisture levels of large samples stored in constant humidity cells) is a lengthy process, taking over 2 years. A surrogate approach is presented in which the WRC is conveniently estimated from mercury intrusion porosimetry (MIP) and validated by water sorption isotherms: the well-known Barrett, Joyner and Halenda (BJH) method of estimating the pore size distribution (PSD) from the water sorption isotherm is shown to complement the PSD derived from conventional MIP. This provides a basis for predicting the complete WRC from MIP data alone. The van Genuchten equation is used to model the combined water sorption and MIP results. It is a convenient tool for describing water retention characteristics over the full moisture content range. The van Genuchten parameter estimation based solely on MIP is shown to give a satisfactory approximation to the WRC, with a simple restriction on one of the parameters.
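The van Genuchten retention model referred to above, in its common Mualem-constrained form theta(h) = theta_r + (theta_s - theta_r) * [1 + (alpha*|h|)^n]^(-(1 - 1/n)), can be sketched directly. The parameter values below are illustrative, not the MIP-fitted values for concrete.

```python
def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Water content at suction h: theta_r + (theta_s - theta_r) * Se(h)."""
    m = 1.0 - 1.0 / n                            # Mualem constraint
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)   # effective saturation
    return theta_r + (theta_s - theta_r) * se

# saturated at zero suction; drains toward theta_r at high suction
print(round(van_genuchten(0.0, 0.02, 0.15, 0.0008, 1.6), 3))  # 0.15
print(round(van_genuchten(1e6, 0.02, 0.15, 0.0008, 1.6), 3))  # near theta_r
```

The "simple restriction on one of the parameters" mentioned in the abstract corresponds, in this parameterization, to constraining one of theta_r, alpha or n when fitting MIP data alone.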
Abstract:
Error-free transmission of a single-polarisation, optical time division multiplexed 40 Gbit/s dispersion-managed pulse data stream over 1009 km has been achieved in a dispersion-compensated standard (non-dispersion-shifted) fibre. This distance is twice the previous record at this data rate and was achieved through techniques developed for dispersion-managed soliton transmission.