977 results for Berkeley


Relevance:

10.00%

Publisher:

Abstract:

The aims of the dissertation are to find the right description of the structure of perceptual experience and to explore the ways in which the structure of the body might serve to explain it. In the first two parts, I articulate and defend the claim that perceptual experience seems direct and the claim that its objects seem real. I defend these claims as integral parts of a coherent metaphysically neutral conception of perceptual experience. Sense-datum theorists, certain influential perceptual psychologists, and early modern philosophers (most notably Berkeley) all disputed the claim that perceptual experience seems direct. In Part I, I argue that the grounds on which they did so were poor. The aim is then, in Part II, to give a proper appreciation of the distinctive intentionality of perceptual experience whilst remaining metaphysically neutral. I do so by drawing on the early work of Edmund Husserl, providing a characterisation of the perceptual experience of objects as real, qua mind-independent particulars. In Part III, I explore two possible explanations of the structure characterising the intentionality of perceptual experience, both of which accord a distinctive explanatory role to the body. On one account, perceptual experience is structured by an implicit pre-reflective consciousness of oneself as a body engaged in perceptual activity. An alternative account makes no appeal to the metaphysically laden concept of a bodily self. It seeks to explain the structure of perceptual experience by appeal to anticipation of the structural constraints of the body. I develop this alternative by highlighting the conceptual and empirical basis for the idea that a first-order structural affordance relation holds between a bodily agent and certain properties of its body. 
I then close with a discussion of the shared background assumptions that ought to inform disputes over whether the body itself (in addition to its representation) ought to serve as an explanans in such an account.

Abstract:

In natural history studies of chronic disease, it is of interest to understand the evolution of key variables that measure aspects of disease progression. This is particularly true for immunological variables in persons infected with the Human Immunodeficiency Virus (HIV). The natural timescale for such studies is time since infection. However, most data available for analysis arise from prevalent cohorts, where the date of infection is unknown for most or all individuals. As a result, standard curve fitting algorithms are not immediately applicable. Here we propose two methods to circumvent this difficulty. The first uses repeated measurement data to provide information not only on the level of the variable of interest, but also on its rate of change, while the second uses an estimate of the expected time since infection. Both methods are based on the principal curves algorithm of Hastie and Stuetzle, and are applied to data from a prevalent cohort of HIV-infected homosexual men, giving estimates of the average pattern of CD4+ lymphocyte decline. These methods are applicable to natural history studies using data from prevalent cohorts where the time of disease origin is uncertain, provided certain ancillary information is available from external sources.
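
As a minimal sketch of the first method's ingredients, a per-subject least-squares line yields both a level and a rate of change from one individual's repeated measurements. The data and function below are hypothetical illustrations; the principal curves step of Hastie and Stuetzle is not shown:

```python
import numpy as np

def level_and_slope(times, values):
    """Least-squares line through one subject's repeated measurements.

    Returns the fitted level at the mean visit time and the rate of
    change (slope), the two quantities the first method relies on."""
    t = np.asarray(times, dtype=float)
    y = np.asarray(values, dtype=float)
    slope, intercept = np.polyfit(t, y, 1)   # coefficients: highest degree first
    level = intercept + slope * t.mean()
    return level, slope

# Hypothetical subject: CD4 counts at four visits (times in years).
level, slope = level_and_slope([0.0, 0.5, 1.0, 1.5], [820, 760, 710, 640])
```

The slope is the subject's estimated rate of CD4 decline; pooling such (level, slope) pairs across subjects is what allows curve fitting without known infection dates.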

Abstract:

In biostatistical applications interest often focuses on the estimation of the distribution of a time-until-event variable T. If one observes whether or not T exceeds an observed monitoring time at a random number of monitoring times, then the data structure is called interval censored data. We extend this data structure by allowing the presence of a possibly time-dependent covariate process that is observed until the end of follow-up. If one only assumes that the censoring mechanism satisfies coarsening at random, then, by the curse of dimensionality, typically no regular estimators will exist. To fight the curse of dimensionality we follow the approach of Robins and Rotnitzky (1992) by modeling parameters of the censoring mechanism. We model the right-censoring mechanism by modeling the hazard of the follow-up time, conditional on T and the covariate process. For the monitoring mechanism we avoid modeling the joint distribution of the monitoring times by modeling only a univariate hazard of the pooled monitoring times, conditional on the follow-up time T and the covariate process, which can be estimated by treating the pooled sample of monitoring times as i.i.d. In particular, it is assumed that the monitoring times and the right-censoring times depend on T only through the observed covariate process. We introduce inverse probability of censoring weighted (IPCW) estimators of the distribution of T, and of smooth functionals thereof, which are guaranteed to be consistent and asymptotically normal if we have correctly specified semiparametric models for the two hazards of the censoring process.
Furthermore, given such correctly specified models for these hazards of the censoring process, we propose a one-step estimator that improves on the IPCW estimator if we correctly specify a lower-dimensional working model for the conditional distribution of T given the covariate process, and that remains consistent and asymptotically normal if this working model is misspecified. It is shown that the one-step estimator is efficient if each subject is monitored at most once and the working model contains the truth. In general, it is shown that the one-step estimator optimally uses the surrogate information if the working model contains the truth. It is not optimal in using the interval information provided by the current status indicators at the monitoring times, but simulations in Peterson and van der Laan (1997) show that the efficiency loss is small.
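
A minimal sketch of the IPCW idea, transposed to the simpler right-censored setting (an assumption made for illustration; the interval-censored estimators above are more involved): each observed failure is weighted by the inverse of the Kaplan-Meier estimate of the probability of not having been censored first.

```python
import numpy as np

def km_censoring_survival(x, delta):
    """Kaplan-Meier estimate of the censoring survival G(t-) = P(C >= t),
    treating the censorings (delta == 0) as the events."""
    x = np.asarray(x, float)
    delta = np.asarray(delta, int)
    order = np.argsort(x)
    x, delta = x[order], delta[order]
    n = len(x)
    surv, jumps = 1.0, []
    for i in range(n):
        if delta[i] == 0:                    # a censoring "event"
            surv *= 1.0 - 1.0 / (n - i)      # n - i subjects still at risk
            jumps.append((x[i], surv))
    def G_minus(t):
        s = 1.0
        for jt, js in jumps:
            if jt < t:
                s = js
        return s
    return G_minus

def ipcw_cdf(x, delta, t):
    """IPCW estimate of F(t) = P(T <= t): each observed failure is weighted
    by the inverse of the estimated probability of not being censored first."""
    x = np.asarray(x, float)
    delta = np.asarray(delta, int)
    G = km_censoring_survival(x, delta)
    w = np.array([d / G(xi) for xi, d in zip(x, delta)])
    return float(np.mean(w * (x <= t)))

x = np.array([1.0, 2.0, 3.0, 4.0])   # observed times (hypothetical data)
delta = np.array([1, 0, 1, 1])       # 1 = failure observed, 0 = censored
F_35 = ipcw_cdf(x, delta, 3.5)       # equals the Kaplan-Meier value 0.625 here
```

With Kaplan-Meier censoring weights this reproduces the ordinary Kaplan-Meier estimate; the point of the weighting is that covariate-dependent censoring models can be substituted for G.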

Abstract:

Common goals in epidemiologic studies of infectious diseases include identification of the infectious agent, description of the modes of transmission and characterization of factors that influence the probability of transmission from infected to uninfected individuals. In the case of AIDS, the agent has been identified as the Human Immunodeficiency Virus (HIV), and transmission is known to occur through a variety of contact mechanisms including unprotected sexual intercourse, transfusion of infected blood products and sharing of needles in intravenous drug use. Relatively little is known about the probability of HIV transmission associated with the various modes of contact, or the role that other cofactors play in promoting or suppressing transmission. Here, transmission probability refers to the probability that the virus is transmitted to a susceptible individual following exposure consisting of a series of potentially infectious contacts. The infectivity of HIV for a given route of transmission is defined to be the per-contact probability of infection. Knowledge of infectivity and its relationship to other factors is important in understanding the dynamics of the AIDS epidemic and in suggesting appropriate measures to control its spread. The primary source of empirical data about infectivity comes from sexual partners of infected individuals. Partner studies consist of a series of such partnerships, usually heterosexual and monogamous, each composed of an initially infected "index case" and a partner who may or may not be infected by the time of data collection. However, because the infection times of both partners may be unknown and the history of contacts uncertain, any quantitative characterization of infectivity is extremely difficult. Thus, most statistical analyses of partner study data involve the simplifying assumption that infectivity is a constant common to all partnerships.
The major objectives of this work are to describe and discuss the design and analysis of partner studies, providing a general statistical framework for investigations of infectivity and risk factors for HIV transmission. The development is largely based on three papers: Jewell and Shiboski (1990), Kim and Lagakos (1990), and Shiboski and Jewell (1992).
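
Under the simplifying constant-infectivity assumption mentioned above, the probability that a partner is infected after k potentially infectious contacts is 1 - (1 - λ)^k, and λ can be estimated by maximum likelihood. A sketch with hypothetical data, not drawn from any actual partner study:

```python
import numpy as np

def loglik(lam, contacts, infected):
    """Log-likelihood under the constant-infectivity model:
    P(partner infected after k contacts) = 1 - (1 - lam)**k.
    The uninfected term is written as k*log(1 - lam) for numerical safety."""
    k = np.asarray(contacts, float)
    y = np.asarray(infected, float)
    q = 1.0 - lam
    return float(np.sum(y * np.log(1.0 - q ** k) + (1.0 - y) * k * np.log(q)))

def mle_infectivity(contacts, infected):
    """Grid-search maximum-likelihood estimate of the per-contact infectivity."""
    grid = np.linspace(1e-4, 0.5, 5000)
    lls = [loglik(l, contacts, infected) for l in grid]
    return float(grid[int(np.argmax(lls))])

# Hypothetical partner-study data: contact counts and infection outcomes.
contacts = [10, 50, 100, 200, 25, 150]
infected = [0, 1, 0, 1, 0, 1]
lam_hat = mle_infectivity(contacts, infected)
```

The grid search stands in for a proper optimizer; with real partner-study data the same likelihood is the starting point for the heterogeneity and risk-factor extensions discussed above.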

Abstract:

Many proposals have been made for estimating the bivariate survival function under random censoring. In this paper we discuss nonparametric maximum likelihood estimation and the bivariate Kaplan-Meier estimator of Dabrowska. We show how these estimators are computed, present their intuitive background, and compare their practical performance under different levels of dependence and censoring, based on extensive simulation results, leading to practical advice.
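
Simulations of this kind require bivariate censored data with a controllable level of dependence; a shared gamma frailty is one standard device for generating such data. The sketch below is a hypothetical illustration, not the simulation design used in the paper:

```python
import numpy as np

def bivariate_censored_sample(n, theta, cens_rate, rng):
    """Bivariate survival times with a shared gamma frailty (smaller theta
    gives stronger positive dependence), independently right-censored."""
    frailty = rng.gamma(shape=theta, scale=1.0 / theta, size=n)  # mean 1
    t1 = rng.exponential(1.0 / frailty)      # conditionally Exp(frailty)
    t2 = rng.exponential(1.0 / frailty)
    c1 = rng.exponential(1.0 / cens_rate, size=n)
    c2 = rng.exponential(1.0 / cens_rate, size=n)
    return (np.minimum(t1, c1), (t1 <= c1).astype(int),
            np.minimum(t2, c2), (t2 <= c2).astype(int))

rng = np.random.default_rng(0)
x1, d1, x2, d2 = bivariate_censored_sample(2000, theta=0.5, cens_rate=0.3, rng=rng)
```

Varying theta and cens_rate sweeps out the dependence/censoring grid on which estimators such as the NPMLE and Dabrowska's estimator can be compared.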

Abstract:

In Malani and Neilsen (1992) we proposed alternative estimates of the survival function (for time to disease) using a simple marker that describes the time to some intermediate stage in a disease process. In this paper we derive the asymptotic variance of one such proposed estimator using two different methods and compare terms of order 1/n when there is no censoring. In the absence of censoring, the asymptotic variance obtained using the Greenwood-type approach converges to the exact variance up to terms involving 1/n. But the asymptotic variance obtained using counting process theory and results from Voelkel and Crowley (1984) on semi-Markov processes has a different term of order 1/n. It is not clear to us at this point why the variance formulae from the latter approach give different results.
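
The agreement between a Greenwood-type variance and the exact variance in the uncensored case can be checked numerically: with no censoring, Greenwood's sum telescopes and the variance estimate reduces exactly to the binomial variance S(t)(1 - S(t))/n. A sketch for the ordinary Kaplan-Meier estimator (not the marker-based estimator of the paper):

```python
import numpy as np

def km_with_greenwood(times, events, t):
    """Kaplan-Meier survival estimate S(t) together with Greenwood's
    variance estimate S(t)^2 * sum of d_j / (n_j (n_j - d_j))."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    surv, var_sum = 1.0, 0.0
    for i in range(n):
        if times[i] > t:
            break
        if events[i] == 1:
            at_risk = n - i                  # handles ties one event at a time
            surv *= (at_risk - 1.0) / at_risk
            if at_risk > 1:                  # Greenwood term undefined at the last event
                var_sum += 1.0 / (at_risk * (at_risk - 1.0))
    return surv, surv ** 2 * var_sum

# No censoring: Greenwood's estimate equals S(t)(1 - S(t))/n exactly.
s, v = km_with_greenwood([1, 2, 3, 4, 5], [1, 1, 1, 1, 1], t=3.5)
```

Here S(3.5) = 2/5 and both variance expressions give 0.4 * 0.6 / 5 = 0.048, illustrating the telescoping identity behind the Greenwood-type comparison.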

Abstract:

Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimation of the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and consequently on the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992) may be a useful "test" example for comparison of various procedures.
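
A failure time governed by an additive hazard a + b·N(t-), with N a Poisson marker process, can be simulated exactly, since the hazard is constant between marker jumps. A hypothetical sketch (parameter values are illustrative, not taken from the papers cited):

```python
import numpy as np

def additive_marker_failure_time(a, b, rho, rng, t_max=100.0):
    """Simulate a failure time whose hazard at time t is a + b * N(t-),
    where N is a Poisson(rho) counting (marker) process. Between marker
    jumps the hazard is constant, so piecewise-exponential simulation
    is exact: at each step, race the next marker jump against failure."""
    t, k = 0.0, 0
    while t < t_max:
        gap_marker = rng.exponential(1.0 / rho)        # time to next marker jump
        gap_fail = rng.exponential(1.0 / (a + b * k))  # failure at current hazard
        if gap_fail < gap_marker:
            return t + gap_fail
        t += gap_marker
        k += 1
    return t_max

rng = np.random.default_rng(2)
times = np.array([additive_marker_failure_time(0.1, 0.5, 1.0, rng)
                  for _ in range(2000)])
```

Because the marker count accumulates, the hazard grows over time and failures occur much earlier than under the baseline hazard a alone; simulated samples like this one give a simple testbed for comparing marker-based procedures.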

Abstract:

In biostatistical applications, interest often focuses on the estimation of the distribution of the time T between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed monitoring time C, then the data are described by the well-known singly-censored current status model, also known as interval censored data, case I. We extend this current status model by allowing the presence of a time-dependent covariate process, which is partly observed, and by allowing the monitoring time C to depend on T through the observed part of this process. Because of the high dimension of the covariate process, no globally efficient estimators exist with good practical performance at moderate sample sizes. We follow the approach of Robins and Rotnitzky (1992) by modeling the distribution of the censoring variable, given the time variable and the covariate process, i.e., the missingness process, under the restriction that it satisfies coarsening at random. We propose a generalization of the simple current status estimator of the distribution of T, and of smooth functionals of this distribution, based on an estimate of the missingness process. In this estimator the covariates enter only through the estimate of the missingness process. Due to the coarsening at random assumption, the estimator has the interesting property that estimating the missingness process more nonparametrically improves its efficiency. We show that by local estimation of an optimal model, or an optimal function of the covariates, for the missingness process, the generalized current status estimator of smooth functionals becomes locally efficient, meaning that it is efficient if the right model or covariate is consistently estimated, and consistent and asymptotically normal in general. Estimation of the optimal model requires estimation of the conditional distribution of T, given the covariates.
Any (prior) knowledge of this conditional distribution can be used at this stage without any risk of losing root-n consistency. We also propose locally efficient one-step estimators. Finally, we present some simulation results.
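
The simple current status estimator being generalized reduces, in its basic form, to isotonic regression of the indicators 1{T <= C} on the monitoring times, computable by the pool-adjacent-violators algorithm. A sketch with hypothetical data:

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y."""
    sums, wts = [], []                       # blocks stored as (sum, weight)
    for v in y:
        sums.append(float(v)); wts.append(1.0)
        while len(sums) > 1 and sums[-2] / wts[-2] > sums[-1] / wts[-1]:
            sums[-2] += sums[-1]; wts[-2] += wts[-1]   # pool violating blocks
            sums.pop(); wts.pop()
    out = []
    for s, w in zip(sums, wts):
        out.extend([s / w] * int(w))
    return np.array(out)

def current_status_npmle(c, delta):
    """NPMLE of F at the sorted monitoring times: isotonic regression of
    the current status indicators delta = 1{T <= C} on the times C."""
    c = np.asarray(c, float)
    delta = np.asarray(delta, float)
    order = np.argsort(c)
    return c[order], pava(delta[order])

# Hypothetical current status data: monitoring times and indicators.
c_sorted, F_hat = current_status_npmle([1, 2, 3, 4, 5, 6], [0, 1, 0, 1, 1, 1])
```

The covariate-dependent generalization described above replaces the raw indicators with inverse-probability-weighted versions, but the monotone-fit backbone is the same.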

Abstract:

We investigate the interplay of smoothness and monotonicity assumptions when estimating a density from a sample of observations. The nonparametric maximum likelihood estimator of a decreasing density on the positive half line attains a rate of convergence of n^{-1/3} at a fixed point if the density has a negative derivative. The same rate is obtained by a kernel estimator, but the limit distributions are different. If the density is both differentiable and known to be monotone, then a third estimator is obtained by isotonization of a kernel estimator. We show that this again attains the n^{-1/3} rate of convergence and compare the limit distributions of the three types of estimators. It is shown that both isotonization and smoothing lead to a more concentrated limit distribution, and we study the dependence on the proportionality constant in the bandwidth. We also show that isotonization does not change the limit behavior of a kernel estimator with a larger bandwidth, in the case that the density is known to have more than one derivative.
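
The construction of the third estimator can be sketched directly: form a kernel density estimate, then isotonize it so that the result is nonincreasing. The sample, grid, and bandwidth below are illustrative choices, not those analyzed in the paper:

```python
import numpy as np

def pava_decreasing(y):
    """Least-squares projection of y onto the set of nonincreasing
    sequences, via the standard pool-adjacent-violators algorithm."""
    sums, wts = [], []
    for v in y:
        sums.append(float(v)); wts.append(1.0)
        while len(sums) > 1 and sums[-2] / wts[-2] < sums[-1] / wts[-1]:
            sums[-2] += sums[-1]; wts[-2] += wts[-1]   # pool increasing violators
            sums.pop(); wts.pop()
    out = []
    for s, w in zip(sums, wts):
        out.extend([s / w] * int(w))
    return np.array(out)

def isotonized_kernel_density(data, grid, h):
    """Gaussian kernel density estimate on the grid, then isotonized so the
    result is nonincreasing, as for a density known to be decreasing."""
    z = (grid[:, None] - data[None, :]) / h
    fhat = np.exp(-0.5 * z ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))
    return fhat, pava_decreasing(fhat)

rng = np.random.default_rng(1)
data = rng.exponential(size=300)        # true density e^{-x} is decreasing
grid = np.linspace(0.0, 4.0, 81)
fhat, f_iso = isotonized_kernel_density(data, grid, h=0.3)
```

Isotonization only modifies the kernel estimate where it violates monotonicity (it averages over the violating stretches), which is consistent with the observation above that it does not change the limit behavior when the bandwidth is large enough.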