994 results for quadrat-variance methods


Relevance: 20.00%

Abstract:

BACKGROUND: This study assessed whether breast cancer (BC) patients express similar levels of needs for equivalent severity of symptoms, functioning difficulties, or degrees of satisfaction with care. BC patients who did (or did not) report needs despite similar difficulties were identified by their sociodemographic and clinical characteristics. PATIENTS AND METHODS: Three hundred and eighty-four BC patients (73% response rate), recruited in ambulatory or surgery hospital services, completed the European Organisation for Research and Treatment of Cancer Quality of Life questionnaire (EORTC QLQ-C30) measuring health-related quality of life (HRQOL), the EORTC IN-PATSAT32 (in-patient) or OUT-PATSAT35 (out-patient) satisfaction-with-care measures, and the 34-item Supportive Care Needs Survey short form (SCNS-SF34). RESULTS: HRQOL and satisfaction-with-care scale scores explained 41%, 45%, 40%, and 22% of the variance in psychological, physical/daily living, information/health system, and care/support needs, respectively (P < 0.001). BC patients' education level, having children, hospital service attendance, and anxiety/depression levels significantly predicted differences in psychological needs relative to corresponding difficulties (adjusted R² = 0.11). Medical history and anxiety/depression levels significantly predicted differences in information/health system needs relative to degrees of satisfaction with doctors, nurses, or radiotherapy technicians and general satisfaction (adjusted R² = 0.12). Unmet needs were most prevalent in the psychological domains across hospital services. CONCLUSIONS: Joint assessment of needs, HRQOL, and satisfaction with care highlights the subgroups of BC patients requiring better-targeted supportive care.
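
The adjusted R² values reported above penalize the plain R² for the number of predictors in the regression. A minimal sketch of the standard formula follows; the sample size n = 384 comes from the study, but the predictor count p is an illustrative assumption.

```python
# Standard adjusted R-squared: 1 - (1 - R^2) * (n - 1) / (n - p - 1).
# n = 384 is the study's sample size; p = 4 predictors is an assumption
# made only for illustration.

def adjusted_r2(r2, n, p):
    """Adjust R^2 downward for the number of predictors p."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

adj = adjusted_r2(r2=0.12, n=384, p=4)  # slightly below the raw 0.12
```

With a large n relative to p, as here, the adjustment is small; with many predictors and few observations it can be substantial.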

Relevance: 20.00%

Abstract:

Adrenal chromaffin cells synthesize and secrete catecholamines and neuropeptides that may regulate hormonal and paracrine signaling in stress and also during inflammation. The aim of our work was to study the role of the cytokine interleukin-1beta (IL-1beta) in catecholamine release and synthesis in primary cell cultures of human adrenal chromaffin cells. The effect of IL-1beta on neuropeptide Y (NPY) release and the intracellular pathways involved in catecholamine release evoked by IL-1beta and NPY were also investigated. We observed that IL-1beta increases the release of NPY, norepinephrine (NE), and epinephrine (EP) from human chromaffin cells, and that immunoneutralization of released NPY inhibits catecholamine release evoked by IL-1beta. IL-1beta also regulates catecholamine synthesis: inhibition of tyrosine hydroxylase decreases IL-1beta-evoked catecholamine release, and the cytokine induces tyrosine hydroxylase Ser40 phosphorylation. Furthermore, IL-1beta induces catecholamine release through a mitogen-activated protein kinase (MAPK)-dependent mechanism and through nitric oxide synthase activation, while MAPK, protein kinase C (PKC), protein kinase A (PKA), and nitric oxide (NO) production are involved in catecholamine release evoked by NPY. Our data from human chromaffin cells suggest that IL-1beta, NPY, and NO may contribute to a regulatory loop between the immune and adrenal systems, which is relevant in pathological conditions such as infection, trauma, stress, or hypertension.

Relevance: 20.00%

Abstract:

The present paper proposes a model for the persistence of abnormal returns at both the firm and industry levels, when longitudinal data on the profits of firms classified into industries are available. The model produces a two-way variance decomposition of abnormal returns: (a) at the firm versus industry level, and (b) into permanent versus transitory components. This variance decomposition supplies information on the relative importance of the fundamental components of abnormal returns that have been discussed in the literature. The model is applied to a Spanish sample of firms, with results including: (a) there are significant and permanent differences between profit rates at both the industry and firm levels; (b) the variation of abnormal returns at the firm level is greater than at the industry level; and (c) the firm and industry levels do not differ significantly in their rates of convergence of abnormal returns.
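
The two-way decomposition described above can be illustrated with a toy ANOVA-style split: firm-level means vary around industry means, which vary around a grand mean (the industry vs. firm split), while year-to-year deviations give the transitory part. The panel data below and the simple moment estimators are illustrative assumptions, not the paper's exact model.

```python
# Toy two-way variance decomposition of abnormal returns:
# (a) industry vs. firm level, (b) permanent vs. transitory components.
# The panel and the naive moment estimators are invented for illustration.

from statistics import mean, pvariance

# abnormal returns per firm over 4 years, tagged with an industry label
panel = {
    "firmA": ("ind1", [0.10, 0.12, 0.08, 0.10]),
    "firmB": ("ind1", [0.02, 0.01, 0.03, 0.02]),
    "firmC": ("ind2", [-0.05, -0.04, -0.06, -0.05]),
    "firmD": ("ind2", [0.00, 0.01, -0.01, 0.00]),
}

firm_means = {f: mean(rs) for f, (ind, rs) in panel.items()}
grand = mean(firm_means.values())

# permanent component: dispersion of firm-level means around the grand mean
permanent = pvariance(firm_means.values(), mu=grand)

# transitory component: average within-firm variance over time
transitory = mean(pvariance(rs, mu=firm_means[f]) for f, (ind, rs) in panel.items())

# split the permanent part into between-industry and within-industry pieces
inds = {}
for f, (ind, rs) in panel.items():
    inds.setdefault(ind, []).append(firm_means[f])
ind_means = {ind: mean(ms) for ind, ms in inds.items()}
between_industry = pvariance(ind_means.values(), mu=grand)
within_industry = mean(
    pvariance(ms, mu=ind_means[ind]) for ind, ms in inds.items()
)
```

By construction the between-industry and within-industry pieces add up to the permanent component, mirroring finding (b) above when the firm-level piece dominates.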

Relevance: 20.00%

Abstract:

The release of transmitters from glia influences synaptic functions. The modalities and physiological functions of glial release are poorly understood. Here we show that glutamate exocytosis from astrocytes of the rat hippocampal dentate molecular layer enhances synaptic strength at excitatory synapses between perforant path afferents and granule cells. The effect is mediated by ifenprodil-sensitive NMDA ionotropic glutamate receptors and involves an increase of transmitter release at the synapse. Correspondingly, we identify NMDA receptor 2B subunits on the extrasynaptic portion of excitatory nerve terminals. The receptor distribution is spatially related to glutamate-containing synaptic-like microvesicles in the apposed astrocytic processes. This glial regulatory pathway is endogenously activated by neuronal activity-dependent stimulation of purinergic P2Y1 receptors on the astrocytes. Thus, we provide the first combined functional and ultrastructural evidence for a physiological control of synaptic activity via exocytosis of glutamate from astrocytes.

Relevance: 20.00%

Abstract:

The paper contrasts empirically the results of alternative methods for estimating the value and the depreciation of mineral resources. The historical data of Mexico and Venezuela, covering the period from the 1920s to the 1980s, are used to contrast the results of several methods: the present value method, the net price method, the user cost method, and the imputed income method. The paper establishes that the net price and the user cost are not competing methods as such, but alternative adjustments to different scenarios of closed and open economies. The results prove that the biases of the methods, as commonly described in the theoretical literature, only hold under the most restricted scenario of constant rents over time. It is argued that the difference between what is expected to happen and what actually did happen is for the most part due to a missing variable, namely technological change. This is an important caveat to the recommendations made based on these models.
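
Two of the compared measures can be sketched directly. The net price method takes the whole resource rent as depreciation, while the user cost method (in El Serafy's common formulation, assumed here) treats only a discounted fraction of it as capital consumption. Prices, costs, the discount rate, and the deposit lifetime below are illustrative numbers, not the Mexican or Venezuelan data.

```python
# Hedged sketch of the net price and user cost depletion measures.
# El Serafy's formulation is assumed for the user cost; all numbers
# are invented for illustration.

def net_price_depreciation(price, unit_cost, extraction):
    """Net price method: depreciation = rent = (price - unit cost) * quantity."""
    return (price - unit_cost) * extraction

def user_cost(rent, r, n):
    """User cost (El Serafy): the part of rent R that is capital consumption
    when the deposit lasts n more years at discount rate r: R / (1 + r)**(n + 1)."""
    return rent / (1.0 + r) ** (n + 1)

rent = net_price_depreciation(price=30.0, unit_cost=10.0, extraction=100.0)
uc = user_cost(rent, r=0.05, n=19)  # strictly smaller than the full rent
```

Note the two measures coincide only when the discount rate is zero, which is one way to see why, as the paper argues, their textbook biases hold only under restrictive constant-rent scenarios.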

Relevance: 20.00%

Abstract:

Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite- and large-sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement on critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
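
Holm's (1979) stepdown method, the baseline mentioned above, can be sketched in a few lines. This is the generic textbook procedure with its monotone critical values alpha/k, alpha/(k-1), ..., alpha; it is not the resampling-based construction the paper develops.

```python
# Holm's stepdown procedure: order the p-values, compare the smallest to
# alpha/k, the next to alpha/(k-1), and so on; stop at the first failure.
# The monotonicity of these critical values is what makes stepping down
# valid, as the abstract emphasizes.

def holm_stepdown(pvalues, alpha=0.05):
    """Return booleans: True where H_i is rejected, with FWE control at alpha."""
    k = len(pvalues)
    order = sorted(range(k), key=lambda i: pvalues[i])
    reject = [False] * k
    for step, i in enumerate(order):
        if pvalues[i] <= alpha / (k - step):
            reject[i] = True
        else:
            break  # all remaining (larger) p-values are also retained
    return reject

# 0.001 and 0.011 are rejected; 0.03 fails its critical value 0.025, so
# the procedure stops there
print(holm_stepdown([0.011, 0.03, 0.001, 0.04]))  # → [True, False, True, False]
```

Bonferroni would compare every p-value to alpha/k = 0.0125; Holm's growing thresholds make it uniformly at least as powerful, which is the starting point the paper improves on.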

Relevance: 20.00%

Abstract:

Bacteria are generally difficult specimens to prepare for conventional resin-section electron microscopy, and mycobacteria, with their thick and complex cell envelope layers, are especially prone to artefacts. Here we made a systematic comparison of different methods for preparing Mycobacterium smegmatis for thin-section electron microscopy analysis. These methods were: (1) conventional preparation with fixatives and epoxy resins at ambient temperature; (2) Tokuyasu cryo-sectioning of chemically fixed bacteria; (3) rapid freezing followed by freeze substitution and embedding in epoxy resin at room temperature, or (4) combined with Lowicryl HM20 embedding and ultraviolet (UV) polymerization at low temperature; and (5) CEMOVIS, or cryo-electron microscopy of vitreous sections. The best preservation of bacteria was obtained with the CEMOVIS method, as expected, especially with respect to the preservation of the cell envelope and lipid bodies. By comparison with CEMOVIS, both the conventional and Tokuyasu methods produced different, undesirable artefacts. The two types of freeze-substitution protocols showed variable preservation of the cell envelope but gave acceptable preservation of the cytoplasm and bacterial DNA, though not of lipid bodies. In conclusion, although CEMOVIS must be considered the 'gold standard' among sectioning methods for electron microscopy, because it avoids solvents and stains, optimally prepared freeze substitution also offers some advantages for the ultrastructural analysis of bacteria.

Relevance: 20.00%

Abstract:

In this paper we address a problem arising in risk management, namely the study of price variations of different contingent claims in the Black-Scholes model due to anticipating future events. The method we propose is an extension of the classical Vega index, i.e. the derivative of the price with respect to the constant volatility, in the sense that we perturb the volatility in different directions. This directional derivative, which we call the local Vega index, serves as the main object of the paper, and one of our purposes is to relate it to the classical Vega index. We show that for all contingent claims studied in this paper the local Vega index can be expressed as a weighted average of the perturbation in volatility. In the particular case where the interest rate and the volatility are constant and the perturbation is deterministic, the local Vega index is an average of this perturbation multiplied by the classical Vega index. We also study the well-known goal problem of maximizing the probability of a perfect hedge and show that the speed of convergence is in fact dependent on the local Vega index.
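
The classical Vega index that the local Vega extends is the sensitivity of the option price to a flat bump in the constant volatility. A sketch for a Black-Scholes call, with illustrative parameter values, checks the closed-form Vega against a finite-difference perturbation of the volatility; the local Vega of the paper would instead perturb volatility in non-constant directions.

```python
# Black-Scholes call price and its classical Vega (dPrice/dSigma),
# verified by central finite differences. Parameter values are illustrative.

from math import log, sqrt, exp, pi, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

def bs_vega(s, k, r, sigma, t):
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    # s * sqrt(t) times the standard normal density at d1
    return s * sqrt(t) * exp(-0.5 * d1 ** 2) / sqrt(2.0 * pi)

s, k, r, sigma, t = 100.0, 100.0, 0.05, 0.2, 1.0
eps = 1e-5
fd = (bs_call(s, k, r, sigma + eps, t) - bs_call(s, k, r, sigma - eps, t)) / (2 * eps)
```

When the perturbation is a constant bump, as here, the price sensitivity is exactly the classical Vega, matching the paper's special case of deterministic perturbations under constant volatility.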

Relevance: 20.00%

Abstract:

Switzerland has the lowest adolescent fertility rate in Western Europe. According to data collected in 1993 as part of the Swiss Multicentre Adolescent Survey on Health, 5% of 1,726 sexually active adolescents in a group of 3,993 15-20-year-old women enrolled in academic or vocational classes had ever been pregnant; most of these women (80%) had terminated their pregnancy. Adolescents who had ever been pregnant did not differ significantly from those who had not by demographic characteristics. Multiple logistic regression analysis identified seven factors associated with pregnancy: having had four or more sexual partners; not having used contraceptives at first intercourse; ever use of less-effective contraceptive methods; having used illicit drugs during the last 30 days; living apart from one's parents; recently experiencing stress; and perceiving a lack of future prospects.

Relevance: 20.00%

Abstract:

The question of where retroviral DNA becomes integrated into chromosomes is important for (i) understanding the mechanisms of viral growth, (ii) devising new anti-retroviral therapies, (iii) understanding how genomes evolve, and (iv) developing safer methods for gene therapy. With the completion of genome sequences for many organisms, it has become possible to study integration targeting by cloning and sequencing large numbers of host-virus DNA junctions, then mapping the host DNA segments back onto the genomic sequence. This allows statistical analysis of the distribution of integration sites relative to the myriad types of genomic features that are also being mapped onto the sequence scaffold. Here we present methods for recovering and analyzing integration site sequences.
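
The mapping step described above can be reduced to a minimal sketch: locate the host side of each host-virus junction in a genome sequence, then count how many integration sites fall inside annotated features. The toy genome, junction sequences, and "gene" coordinates below are invented for illustration; real pipelines align against full genome assemblies rather than using exact substring search.

```python
# Toy version of integration-site mapping: find each host-side junction
# sequence in a genome string and tally sites landing inside gene intervals.
# Genome, junctions, and gene coordinates are all illustrative.

genome = "ATGCGTACGTTAGCAGGCTAACGTTACGGATCCGTAGCTAGGCTTACG"
genes = [(5, 20), (30, 45)]  # half-open [start, end) intervals of toy genes

junctions = ["TAGCAGG", "GATCCGT", "ACGTTAG"]  # host side of each junction

def map_site(host_seq, genome):
    """Return the genome coordinate of the host segment, or None if unmapped."""
    pos = genome.find(host_seq)
    return pos if pos >= 0 else None

sites = [map_site(j, genome) for j in junctions]
in_genes = sum(
    1 for s in sites
    if s is not None and any(a <= s < b for a, b in genes)
)
```

Comparing the observed fraction in genes against the fraction expected from randomly placed sites is the kind of statistical analysis the abstract refers to.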

Relevance: 20.00%

Abstract:

BACKGROUND: Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim's epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim's DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim's fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim's fraction, and then digest the residual victim's DNA with a nuclease. METHODS: The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. RESULTS: For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern of Erase providing superior profiles. CONCLUSIONS: In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods.

Relevance: 20.00%

Abstract:

Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those whose weights involve area-specific estimates of bias and variance; and (b) those whose weights involve a common variance and a common squared-bias estimate for all the areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
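
The composite estimator described above can be sketched in its simplest form: a weight trades off the variance of the (unbiased but noisy) direct estimator against the squared bias of the (stable but biased) indirect estimator. The MSE-minimizing weight under independence, used below, is the standard textbook choice and only an illustration of type (a); the paper's estimators need these quantities estimated from data.

```python
# Composite small-area estimator: w * direct + (1 - w) * indirect.
# The weight shown minimizes MSE when the direct estimator is unbiased,
# the two are independent, and variance / squared bias are known --
# assumptions of this sketch, not of the paper.

def composite(direct, indirect, var_direct, sq_bias_indirect):
    """Return (composite estimate, weight on the direct estimator)."""
    w = sq_bias_indirect / (var_direct + sq_bias_indirect)
    return w * direct + (1.0 - w) * indirect, w

# noisy direct estimate 12.0 (variance 4), biased indirect estimate 10.0
# (squared bias 1): the composite leans toward the stabler indirect value
est, w = composite(direct=12.0, indirect=10.0, var_direct=4.0, sq_bias_indirect=1.0)
```

Type (b) estimators in the paper replace the area-specific variance and squared bias in this weight with common values shared by all areas.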

Relevance: 20.00%

Abstract:

Principal curves were defined by Hastie and Stuetzle (JASA, 1989) as smooth curves passing through the middle of a multidimensional data set. They are nonlinear generalizations of the first principal component, a characterization of which is the basis for the definition of principal curves. In this paper we propose an alternative approach based on a different property of principal components. Consider a point in the space where a multivariate normal is defined and, for each hyperplane containing that point, compute the total variance of the normal distribution conditioned to belong to that hyperplane. Now choose the hyperplane minimizing this conditional total variance and look for the corresponding conditional mean. The first principal component of the original distribution passes through this conditional mean and is orthogonal to that hyperplane. This property is easily generalized to data sets with nonlinear structure. Repeating the search from different starting points, many points analogous to conditional means are found. We call them principal oriented points. When a one-dimensional curve runs through the set of these special points, it is called a principal curve of oriented points. Successive principal curves are recursively defined from a generalization of the total variance.
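
Both characterizations above build on the first principal component, i.e. the dominant eigenvector of the covariance matrix. A minimal pure-Python sketch computes it by power iteration for 2-D data; the data points are illustrative, and this is only the linear building block, not the oriented-points search the paper proposes.

```python
# First principal component of 2-D data via power iteration on the
# sample covariance matrix. Data are illustrative points near y = x,
# so the component should be close to (1/sqrt(2), 1/sqrt(2)).

def first_principal_component(data, iters=200):
    n = len(data)
    means = [sum(x[j] for x in data) / n for j in range(2)]
    centered = [[x[j] - means[j] for j in range(2)] for x in data]
    # 2x2 sample covariance matrix (population normalization)
    c = [[sum(row[i] * row[j] for row in centered) / n for j in range(2)]
         for i in range(2)]
    v = [1.0, 0.0]
    for _ in range(iters):
        w = [c[0][0] * v[0] + c[0][1] * v[1],
             c[1][0] * v[0] + c[1][1] * v[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]  # renormalize each step
    return v

pc = first_principal_component([(0, 0), (1, 1.1), (2, 1.9), (3, 3.0)])
```

In the paper's construction, this direction appears as the normal of the variance-minimizing hyperplane through a point, with the conditional mean playing the role of a principal oriented point.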

Relevance: 20.00%

Abstract:

This paper establishes a general framework for the metric scaling of any distance measure between individuals based on a rectangular individuals-by-variables data matrix. The method allows visualization of both individuals and variables and preserves all the good properties of principal-axis methods, such as principal components and correspondence analysis, that are based on the singular value decomposition, including the decomposition of variance into components along principal axes, which provides the numerical diagnostics known as contributions. The idea is inspired by the chi-square distance in correspondence analysis, which weights each coordinate by an amount calculated from the margins of the data table. In weighted metric multidimensional scaling (WMDS) we allow these weights to be unknown parameters that are estimated from the data to maximize the fit to the original distances. Once this extra weight-estimation step is accomplished, the procedure follows the classical path of decomposing a matrix and displaying its rows and columns in biplots.
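
The chi-square weighting that inspires WMDS can be sketched directly: in correspondence analysis, the distance between two row profiles weights each coordinate by the inverse of the corresponding column margin. The contingency table below is a toy example; WMDS would instead treat the weights as free parameters fitted to the target distances.

```python
# Chi-square distance between row profiles of a toy contingency table,
# with each coordinate weighted by the inverse column margin -- the
# fixed-weight special case that WMDS generalizes.

def chi_square_distance(row_i, row_j, col_margins):
    """Weighted Euclidean distance: sqrt(sum_k (a_k - b_k)^2 / m_k)."""
    return sum((a - b) ** 2 / m
               for a, b, m in zip(row_i, row_j, col_margins)) ** 0.5

table = [[20, 30, 50], [10, 40, 50], [30, 30, 40]]
total = sum(sum(row) for row in table)

# row profiles (rows rescaled to sum to 1) and column margins
profiles = [[v / sum(row) for v in row] for row in table]
col_margins = [sum(row[k] for row in table) / total for k in range(3)]

d01 = chi_square_distance(profiles[0], profiles[1], col_margins)
```

Replacing `col_margins` with estimated weights, then running the usual SVD-based decomposition on the weighted coordinates, follows the classical path to biplots that the abstract describes.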