984 results for Right censored data
Abstract:
Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimating the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and, consequently, about the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up to time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for the onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992), may be a useful "test" example for comparing various procedures.
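The additive marker model can be illustrated with a small simulation. This is a hedged sketch only: the specific form lambda(t) = a + b*Y(t), where Y(t) counts the events of a Poisson marker process, and every rate value below are illustrative assumptions for this example, not the authors' exact specification.

```python
import random

def simulate_failure_time(a=0.05, b=0.02, marker_rate=1.0, rng=random):
    """Simulate one time to failure under an additive model
    lambda(t) = a + b * Y(t), where Y(t) is a Poisson marker process
    with the given event rate (all parameter values illustrative)."""
    target = rng.expovariate(1.0)    # failure occurs when the cumulative hazard hits this
    t, y, cum = 0.0, 0, 0.0
    while True:
        gap = rng.expovariate(marker_rate)   # waiting time to the next marker event
        hazard = a + b * y                   # hazard is piecewise constant between events
        if cum + hazard * gap >= target:     # failure before the next marker jump
            return t + (target - cum) / hazard
        cum += hazard * gap
        t += gap
        y += 1                               # marker count increments at the event

random.seed(1)
times = [simulate_failure_time() for _ in range(1000)]
print(sum(times) / len(times))   # mean simulated time to failure
```

Because the marker count only grows, the hazard rises over time, so the simulated failure times are stochastically shorter than under the baseline hazard a alone.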
Abstract:
Information is widely regarded as one of the key concepts of modern society. The production, distribution and use of information are among the key aspects of modern economies. Driven by technological progress, information has become a good in its own right. This has established an information economy and challenged the law to provide a framework apt to promote the production of information, enable its distribution and efficient allocation, and deal with the risks inherent in information technology. Property rights are a major component of such a framework. However, information as an object of property rights is not limited to intellectual property but may also occur as an aspect of personality or even as tangible property. Accordingly, information as property can be found in the areas of intellectual property, personality protection and other property rights. This essay attempts to categorize three different types of information that can be understood as a good in the economic sense and an object in the legal sense: semantic information, syntactic information and structural information. It shows how legal ownership of such information is established by different subjective rights. In addition, the widespread debate regarding the justification of intellectual property rights is examined from the wider perspective of informational property in general. Finally, in light of current debates, this essay explores whether "data producers" should have a new kind of property right in data.
Abstract:
The Antarctic continental slope spans the depths from the shelf break (usually between 500 and 1000 m) to ~3000 m, is very steep, is overlain by 'warm' (2-2.5 °C) Circumpolar Deep Water (CDW), and its life is poorly studied. This study investigates whether life on Antarctica's continental slope is essentially an extension of the shelf or the abyssal fauna, a transition zone between these, or clearly distinct in its own right. The answer has important ramifications for the link between physical oceanography and ecology, and for the potential of the slope to act as a refuge and resupply zone for the shelf during glaciations. Using data from several cruises to the Weddell Sea and Scotia Sea, including the ANDEEP (ANtarctic benthic DEEP-sea biodiversity, colonisation history and recent community patterns) I-III, BIOPEARL (Biodiversity, Phylogeny, Evolution and Adaptive Radiation of Life in Antarctica) 1 and EASIZ (Ecology of the Antarctic Sea Ice Zone) II cruises, as well as current databases (SOMBASE, SCAR-MarBIN), four taxa (cheilostome bryozoans, isopod and ostracod crustaceans, and echinoid echinoderms) and two areas (the Weddell Sea and the Scotia Sea) were selected to examine faunal composition, richness and affinities. Benthic samples were collected using an Agassiz trawl, an epibenthic sledge and a Rauschert sled. Under a bathymetric definition of the slope, these data suggest that, despite eurybathy in some of the groups examined and the apparent similarity of physical conditions around the Antarctic, the shelf, slope and abyssal faunas were clearly separated in the Weddell Sea. However, no such separation of faunas was apparent in the Scotia Sea (except in echinoids). Using a geomorphological definition of the slope, shelf-slope-abyss similarity changed significantly only in the bryozoans. Our results did not support the presence of a homogeneous and unique Antarctic slope fauna, despite a high number of species being restricted to the slope.
A unique Antarctic slope fauna may nevertheless exist, but the paucity of our samples could not demonstrate this in the Scotia Sea. It is very likely that various ecological and evolutionary factors (such as topography, water-mass and sediment characteristics, input of particulate organic carbon (POC) and glaciological history) drive slope distinctness. Isopods showed their greatest species richness at slope depths, whereas bryozoans and ostracods were more speciose at shelf depths; however, significance varied between the Weddell Sea and the Scotia Sea and with the bathymetric vs. geomorphological definitions. Whilst the slope may harbour some source populations for localised shelf recolonisation, the absence of many shelf species, genera and even families (in a poorly dispersing taxon) from the continental slope indicates that it was not a universal refuge for the Antarctic shelf fauna.
Abstract:
The study of the reliability of components and systems is of great importance in several fields of engineering, and very particularly in computer science. When analysing the lifetimes of the elements in a sample, one must take into account the elements that do not fail within the duration of the experiment, as well as those that fail from causes other than the one under study. New sampling schemes have therefore arisen to cover these cases. The most general of them, censored sampling, is the one considered in this work. In this scheme, both the time until the component fails and the censoring time are random variables. Under the hypothesis that both times are exponentially distributed, Professor Hurt studied the asymptotic behaviour of the maximum likelihood estimator of the reliability function. Bayesian methods appear attractive for reliability studies because they incorporate into the analysis the prior information that is normally available in real problems. We have therefore considered two Bayes estimators of the reliability of an exponential distribution: the mean and the mode of the posterior distribution. We have calculated the asymptotic expansion of the mean, variance and mean square error of both estimators when the censoring distribution is exponential, and we have also obtained the asymptotic distribution of the estimators for the more general case in which the censoring distribution is Weibull. Two types of large-sample confidence intervals have been proposed for each estimator. The results have been compared with those of the maximum likelihood estimator and with those of two non-parametric estimators, the product-limit and a Bayesian one, with one of our estimators showing superior behaviour.
Finally, we have verified by simulation that our estimators are robust against the assumed censoring distribution, and that one of the proposed confidence intervals remains valid with small samples; this study also confirmed the better behaviour of one of our estimators.
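For the exponential model with random censoring, the conjugate Bayesian calculation is short enough to sketch. This is the standard Gamma-prior construction, not necessarily the thesis's exact estimators: with d observed failures and total time at risk T, a Gamma(a0, b0) prior on the failure rate yields a Gamma(a0 + d, b0 + T) posterior, and the posterior mean of the reliability R(t) = exp(-lambda*t) has a closed form. All numbers below are illustrative.

```python
def bayes_reliability(times, observed, t, a0=1.0, b0=1.0):
    """Posterior-mean estimate of R(t) = exp(-lambda * t) for exponential
    lifetimes under random censoring, with a Gamma(a0, b0) prior on the
    failure rate (a standard conjugate sketch; prior values are illustrative).

    times    -- observed durations (failure or censoring times)
    observed -- 1 if the i-th duration ended in failure, 0 if censored
    """
    d = sum(observed)   # number of uncensored failures
    T = sum(times)      # total time at risk (failures and censored units)
    # Posterior is Gamma(a0 + d, b0 + T); E[exp(-lambda * t)] is the Gamma
    # Laplace transform evaluated at t:
    return ((b0 + T) / (b0 + T + t)) ** (a0 + d)

# Illustrative sample with two censored observations
times = [2.0, 5.0, 3.5, 7.0, 1.2]
observed = [1, 0, 1, 1, 0]
print(bayes_reliability(times, observed, t=3.0))
```

As expected of a survival function estimate, the value decreases monotonically in t and lies strictly between 0 and 1.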
Abstract:
Continuing technological advances are bringing with them new ways of storing, processing and communicating personal data. It is necessary to rethink the fundamental right to data protection and to devise mechanisms to adapt it to these new forms of processing. At the European level, work is under way on a new proposed regulation on the protection of individuals with regard to the processing of personal data, which we consider, in general, well suited to meet the new challenges in this area. To illustrate all of this, the present study examines in detail the case of cloud computing, its main characteristics and some concerns about the potential risks that its use entails.
Abstract:
In this work, a new family of distributions is proposed which allows survival data to be modelled when the hazard function has unimodal or U (bathtub) shapes. Modifications of the Weibull, Fréchet, generalized half-normal, log-logistic and lognormal distributions were also considered. Using uncensored and censored data, maximum likelihood estimators were considered for the proposed model in order to verify the flexibility of the new family. In addition, a location-scale regression model was used to assess the influence of covariates on survival times. A residual analysis based on modified deviance residuals was also carried out. Simulation studies, using different parameter settings, censoring percentages and sample sizes, were conducted to examine the empirical distribution of the martingale-type and modified deviance residuals. To detect influential observations, local influence measures were used, that is, diagnostic measures based on small perturbations of the data or of the proposed model. Situations may occur in which the assumption of independence between failure and censoring times does not hold. Thus, another aim of this work is to consider an informative censoring mechanism, based on the marginal likelihood, using the log-odd log-logistic Weibull distribution in the modelling. Finally, the described methodologies are applied to real data sets.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Anterior segment optical coherence tomography (AS-OCT, Visante; Zeiss) was used to examine meridional variation in anterior scleral thickness (AST) and its association with refractive error, ethnicity and gender. Scleral cross-sections of 74 individuals (28 males; 46 females; aged 18-40 years (27.7±5.3)) were sampled twice, in random order, in 8 meridians: superior (S), inferior (I), nasal (N), temporal (T), superior-temporal (ST), superior-nasal (SN), inferior-temporal (IT) and inferior-nasal (IN). AST was measured in 1 mm anterior-to-posterior increments (designated the A-P distance) from the scleral spur (SS) over a 6 mm distance. Axial length and refractive error were measured with a Zeiss IOLMaster biometer and an open-view binocular Shin-Nippon autorefractor. Intra- and inter-observer variability of AST was assessed for each of the 8 meridians. Mixed repeated-measures ANOVAs tested meridional and A-P distance differences in AST against refractive error, gender and ethnicity. Only right-eye data were analysed. AST (mean±SD) across all meridians and A-P distances was 725±46 μm. Meridian SN was the thinnest (662±57 μm) and I the thickest (806±60 μm). Significant differences were found between all meridians (p<0.001), except S:ST, IT:IN, IT:N and IN:N. Significant differences between A-P distances were found, except between SS and 6 mm and between 2 and 4 mm. AST measurements at 1 mm (682±48 μm) were the thinnest and at 6 mm (818±49 μm) the thickest (p<0.001); a significant interaction occurred between meridians and A-P distances (p<0.001). AST was significantly greater (p<0.001) in male subjects, but no significant differences were found with refractive error or ethnicity. Significant variations in AST occur with meridian and distance from the SS and may have utility in selecting optimum sites for pharmaceutical or surgical intervention.
Abstract:
2000 Mathematics Subject Classification: 62E16, 62F15, 62H12, 62M20.
Abstract:
AIM To compare the survival rates of Class II Atraumatic Restorative Treatment (ART) restorations placed in primary molars using cotton rolls or rubber dam as isolation methods. METHODS A total of 232 children, 6-7 years old, of both genders, were selected, each having one primary molar with a proximal dentine lesion. The children were randomly assigned to two groups: a control group, with Class II ART restorations made using cotton rolls, and an experimental group using rubber dam. The restorations were evaluated by eight calibrated evaluators (Kappa > 0.8) after 6, 12, 18 and 24 months. RESULTS A total of 48 (20.7%) children were considered dropouts after 24 months. The cumulative survival rates after 6, 12, 18 and 24 months were 61.4%, 39.0%, 29.1% and 18.0% for the control group, and 64.1%, 55.1%, 40.1% and 32.1% for the rubber dam group, respectively. The log-rank test for censored data showed no statistically significant difference between the groups (P = 0.07). Univariate Cox regression showed no statistically significant difference after adjusting for independent variables (P > 0.05). CONCLUSION Both groups had similar survival rates; after 2 years, the use of rubber dam does not significantly increase the success of Class II ART restorations.
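Cumulative survival rates of this kind are conventionally obtained with the Kaplan-Meier product-limit estimator, which accounts for censored observations such as dropouts. A minimal sketch with made-up follow-up data, not the study's figures:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates S(t) at each distinct failure time.

    times  -- follow-up time of each restoration (e.g. months)
    events -- 1 if the restoration failed at that time, 0 if censored
    Returns a list of (time, survival) pairs.
    """
    data = sorted(zip(times, events))
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t and e == 1)  # failures at t
        n_t = sum(1 for tt, _ in data if tt >= t)           # still at risk at t
        if d > 0:
            surv *= 1.0 - d / n_t                           # product-limit step
            out.append((t, surv))
        while i < len(data) and data[i][0] == t:            # skip past time t
            i += 1
    return out

# Illustrative data: failures at 6, 12 and 18 months, the rest censored
times  = [6, 6, 12, 12, 18, 24, 24, 24]
events = [1, 0, 1,  1,  1,  0,  0,  0]
print(kaplan_meier(times, events))   # → [(6, 0.875), (12, 0.5833...), (18, 0.4375)]
```

Each factor (1 - d/n_t) is the conditional probability of surviving past time t given survival up to t, which is what makes censored units contribute to the risk sets without being counted as failures.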
Abstract:
Historically, the health risk of mycotoxins has been evaluated on the basis of single-chemical and single-exposure-pathway scenarios. However, the co-contamination of foodstuffs with these compounds is being reported at an increasing rate, and a multiple-exposure scenario for humans, and for vulnerable population groups such as children, is urgently needed. Cereals are among the first solid foods eaten by children and thus constitute an important food group in their diet. Few data are available on children's exposure to mycotoxins at early life stages through the consumption of cereal-based foods. The present study aims to perform a cumulative risk assessment of the mycotoxins present in a set of cereal-based foods, including breakfast cereals (BC), processed cereal-based foods (PCBF) and biscuits (BT), consumed by children (1 to 3 years old, n=75) from the Lisbon region, Portugal. Children's food consumption data and the occurrence of 12 mycotoxins (aflatoxins, ochratoxin A, fumonisins and trichothecenes) in cereal-based foods were combined to estimate the daily mycotoxin intake, using deterministic and probabilistic approaches. Different strategies were used to treat the left-censored data. For aflatoxins, as carcinogenic compounds, the margin of exposure (MoE) was calculated as the ratio of the BMDL (benchmark dose lower confidence limit) to the daily aflatoxin exposure. For the remaining mycotoxins, the exposure output was compared with the reference dose values (tolerable daily intake, TDI) in order to calculate hazard quotients (HQ, the ratio between exposure and a reference dose). The concentration addition (CA) concept was used for the cumulative risk assessment of multiple mycotoxins. The combined margin of exposure (MoET) and the hazard index (HI) were calculated for aflatoxins and for the remaining mycotoxins, respectively. The main results revealed a significant health concern related to aflatoxins, and especially to aflatoxin M1 exposure, according to the MoET and MoE values (below 10000).
HQ and HI values for the remaining mycotoxins were below 1, revealing low concern from a public health point of view. These are the first results on the cumulative risk assessment of multiple mycotoxins present in cereal-based foods consumed by children. Considering the present results, more research is needed to provide governmental regulatory bodies with data to develop an approach that contemplates human, and particularly children's, exposure to multiple mycotoxins in food. The last issue is particularly important considering the potential synergistic effects that could occur between mycotoxins and their potential impact on human health, mainly that of children.
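The risk metrics used above reduce to simple ratios: MoE = BMDL / exposure, HQ = exposure / TDI, and, under concentration addition, HI is the sum of the individual HQs. A sketch with placeholder numbers; the BMDL, TDI and exposure values below are hypothetical, not the study's data:

```python
def margin_of_exposure(bmdl, exposure):
    """MoE = BMDL / estimated daily exposure; for genotoxic carcinogens
    such as aflatoxins, values below 10000 signal a health concern."""
    return bmdl / exposure

def hazard_quotient(exposure, tdi):
    """HQ = exposure / tolerable daily intake; HQ > 1 signals concern."""
    return exposure / tdi

def hazard_index(hqs):
    """HI under concentration addition: the sum of individual HQs."""
    return sum(hqs)

# Hypothetical daily exposures in ug/kg bw/day (placeholders, not study data)
moe = margin_of_exposure(bmdl=0.17, exposure=0.00005)
hqs = [hazard_quotient(e, tdi) for e, tdi in [(0.1, 1.0), (0.2, 2.0)]]
print(moe, hazard_index(hqs))
```

With these placeholder inputs the MoE comes out at 3400, below the 10000 benchmark, while the HI stays well under 1, mirroring the pattern of concern the abstract reports for aflatoxins versus the remaining mycotoxins.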
Abstract:
It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health has been reflected in an increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity but not taking into account the complex dynamics associated with interactions between different mycotoxins, or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards generating knowledge to support more accurate human risk assessment and risk management. For this purpose, a physiologically-based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment.
On the other hand, the application of predictive mathematical models to estimate the combined effects of mycotoxins from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project aimed at exploring the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on the hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial for improving data quality and will contribute to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
Abstract:
People, animals and the environment can be exposed to multiple chemicals at once from a variety of sources, but current risk assessment is usually carried out for one chemical substance at a time. In human health risk assessment, ingestion of food is considered a major route of exposure to many contaminants, namely mycotoxins, a wide group of fungal secondary metabolites known to potentially cause toxic and carcinogenic outcomes. Mycotoxins are commonly found in a variety of foods, including those intended for consumption by infants and young children, and have been found in processed cereal-based foods available on the Portuguese market. The use of mathematical models, including probabilistic approaches using Monte Carlo simulations, is a prominent issue in human health risk assessment in general and in mycotoxin exposure assessment in particular. The present study aims to characterize, for the first time, the risk associated with the exposure of Portuguese children to single and multiple mycotoxins present in processed cereal-based foods (CBF). Food consumption data for Portuguese children (0-3 years old, n=103) were collected using a 3-day food diary. Contamination data concerned the quantification of 12 mycotoxins (aflatoxins, ochratoxin A, fumonisins and trichothecenes) in 20 CBF samples marketed in 2014 and 2015 in Lisbon; samples were analyzed by HPLC-FLD, LC-MS/MS and GC-MS. Children's daily exposure to mycotoxins was estimated using deterministic and probabilistic approaches. Different strategies were used to treat the left-censored data. For aflatoxins, as carcinogenic compounds, the margin of exposure (MoE) was calculated as the ratio of the BMDL (benchmark dose lower confidence limit) to the aflatoxin exposure. The magnitude of the MoE gives an indication of the risk level.
For the remaining mycotoxins, the exposure output was compared with the reference dose values (tolerable daily intake, TDI) in order to calculate hazard quotients (HQ, the ratio between exposure and a reference dose). For the cumulative risk assessment of multiple mycotoxins, the concentration addition (CA) concept was used. The combined margin of exposure (MoET) and the hazard index (HI) were calculated for aflatoxins and for the remaining mycotoxins, respectively. Of the analyzed CBF samples, 71% were contaminated with mycotoxins (at values below the legal limits), and approximately 56% of the studied children consumed CBF at least once during the 3 days. Preliminary results showed that children's exposure to single mycotoxins present in CBF was below the TDI. The aflatoxin MoE and MoET revealed a reduced potential risk from exposure through the consumption of CBF (with values around 10000 or more). HQ and HI values for the remaining mycotoxins were below 1. Children are a particularly vulnerable population group with regard to food contaminants, and the present results point to an urgent need to establish legal limits and control strategies regarding the presence of multiple mycotoxins in children's foods in order to protect their health. The development of packaging materials with antifungal properties is a possible way to control the growth of moulds and consequently reduce mycotoxin production, helping to guarantee the quality and safety of foods intended for children.
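The probabilistic side of such an exposure assessment can be sketched as a Monte Carlo simulation in which left-censored results (below the limit of detection) are handled by substitution, here LOD/2, one common strategy. All concentrations, consumption figures and the body weight below are hypothetical placeholders, not the study's data:

```python
import random

LOD = 0.5   # limit of detection, ug/kg (hypothetical)
# Measured concentrations in food; None marks a left-censored (<LOD) result
raw = [1.2, None, 0.8, None, 2.5, None, 1.0]

# Substitution strategy for left-censored data: replace <LOD results by LOD/2
conc = [c if c is not None else LOD / 2 for c in raw]

def simulated_daily_intakes(n=10000, body_weight=12.0, rng=random):
    """Monte Carlo estimate of daily intake (ug/kg bw/day): each iteration
    samples a concentration and a consumed amount of cereal-based food.
    All distributions and figures are illustrative placeholders."""
    intakes = []
    for _ in range(n):
        c = rng.choice(conc)                 # ug of mycotoxin per kg of food
        amount = rng.uniform(10.0, 50.0)     # g of cereal-based food per day
        intakes.append(c * amount / 1000.0 / body_weight)
    return intakes

random.seed(7)
intakes = sorted(simulated_daily_intakes())
print(intakes[int(0.95 * len(intakes))])     # P95 of the exposure distribution
```

A high percentile such as the P95 (rather than the mean) is the usual output compared against the TDI, since it captures the more exposed children in the simulated population.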
Abstract:
This paper analyses the figure of the right to be forgotten on the Internet. The figure is examined from its origins, as mere requests for the protection of rights filed with the data protection agencies of European countries, up to its current application, in which it is configured as a quasi-fundamental right framed within the sphere of the fundamental right to data protection. The work carried out by the Spanish Data Protection Agency is also analysed, as it was the first European agency to decide to sanction search engines for the improper processing of data in their operation. As a result of its work, these cases reached the Audiencia Nacional, which referred a technically brilliant question for a preliminary ruling on the conduct of search engines in relation to the establishment of the right to be forgotten.
Abstract:
Abstract Big data is nowadays a fashionable topic, independently of what people mean when they use this term. But being big is just a matter of volume, although there is no clear agreement on the size threshold. On the other hand, it is easy to capture large amounts of data using a brute-force approach. So the real goal should not be big data, but to ask ourselves, for a given problem, what is the right data and how much of it is needed. For some problems this would imply big data, but for the majority of problems much less data is needed. In this talk we explore the trade-offs involved and the main problems that come with big data, using the Web as a case study: scalability, redundancy, bias, noise, spam, and privacy. Speaker Biography Ricardo Baeza-Yates is VP of Research for Yahoo Labs, leading teams in the United States, Europe and Latin America since 2006, and has been based in Sunnyvale, California, since August 2014. During this time he has led the labs in Barcelona and Santiago de Chile. Between 2008 and 2012 he also oversaw the Haifa lab. He is also a part-time Professor at the Dept. of Information and Communication Technologies of the Universitat Pompeu Fabra in Barcelona, Spain. During 2005 he was an ICREA research professor at the same university. Until 2004 he was Professor and, before that, founder and Director of the Center for Web Research at the Dept. of Computing Science of the University of Chile (on leave of absence to this day). He obtained a Ph.D. in CS from the University of Waterloo, Canada, in 1989. Before that, he obtained two master's degrees (M.Sc. CS & M.Eng. EE) and the electronics engineer degree from the University of Chile in Santiago. He is co-author of the best-selling textbook Modern Information Retrieval, published in 1999 by Addison-Wesley, with a second enlarged edition in 2011 that won the ASIST 2012 Book of the Year award.
He is also co-author of the 2nd edition of the Handbook of Algorithms and Data Structures, Addison-Wesley, 1991, and co-editor of Information Retrieval: Algorithms and Data Structures, Prentice-Hall, 1992, among more than 500 other publications. From 2002 to 2004 he was elected to the board of governors of the IEEE Computer Society, and in 2012 he was elected to the ACM Council. He has received the Organization of American States award for young researchers in exact sciences (1993), the Graham Medal for innovation in computing, given by the University of Waterloo to distinguished ex-alumni (2007), the CLEI Latin American distinction for contributions to CS in the region (2009), and the National Award of the Chilean Association of Engineers (2010), among other distinctions. In 2003 he was the first computer scientist to be elected to the Chilean Academy of Sciences, and since 2010 he has been a founding member of the Chilean Academy of Engineering. In 2009 he was named an ACM Fellow and in 2011 an IEEE Fellow.