64 results for Mathematical and statistical techniques
Abstract:
Two stochastic production frontier models are formulated within the generalized production function framework popularized by Zellner and Revankar (Rev. Econ. Stud. 36 (1969) 241) and Zellner and Ryu (J. Appl. Econometrics 13 (1998) 101). This framework is convenient for parsimonious modeling of a production function with returns to scale specified as a function of output. Two alternatives for introducing the stochastic inefficiency term and the stochastic error are considered. In the first, the errors are added to an equation of the form h(log y, theta) = log f(x, beta), where y denotes output, x is a vector of inputs, and (theta, beta) are parameters. In the second, the equation h(log y, theta) = log f(x, beta) is solved for log y to yield a solution of the form log y = g[theta, log f(x, beta)], and the errors are added to this equation. The latter alternative is novel, but it is needed to preserve the usual definition of firm efficiency. The two alternative stochastic assumptions are considered in conjunction with two returns-to-scale functions, making a total of four models. A Bayesian framework for estimating all four models is described. The techniques are applied to USDA state-level data on agricultural output and four inputs. Posterior distributions for all parameters, for firm efficiencies, and for the efficiency rankings of firms are obtained. The sensitivity of the results to the returns-to-scale specification and to the stochastic specification is examined. (c) 2004 Elsevier B.V. All rights reserved.
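The inversion step described in this abstract can be sketched numerically. Assuming, purely for illustration, a Zellner-Revankar-style transformation h(log y, theta) = log y + theta*y (the abstract does not give the exact functional forms used), g has no closed form, so log y is recovered from log f(x, beta) by root finding; all names below are hypothetical:

```python
import math

def h(log_y, theta):
    """Illustrative Zellner-Revankar-style transformation:
    h(log y, theta) = log y + theta * y (an assumed form)."""
    return log_y + theta * math.exp(log_y)

def g(theta, log_f, lo=-30.0, hi=30.0, tol=1e-12):
    """Invert h: solve h(log y, theta) = log f for log y by bisection.
    For theta >= 0, h is strictly increasing in log y, so the root is unique."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if h(mid, theta) < log_f:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round trip: recover log y from log f and check h(g(theta, log f), theta) = log f.
theta, log_f = 0.3, 2.0
log_y = g(theta, log_f)
residual = h(log_y, theta) - log_f
```

With theta > 0 the transformation penalizes large output, so the recovered log y lies below log f; errors added to log y (the second specification in the abstract) then act directly on log output, preserving the usual efficiency definition.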
Abstract:
The estimation of P(S_n > u) by simulation, where S_n is the sum of independent, identically distributed random variables Y_1, ..., Y_n, is of importance in many applications. We propose two simulation estimators based upon the identity P(S_n > u) = nP(S_n > u, M_n = Y_n), where M_n = max(Y_1, ..., Y_n). One estimator uses importance sampling (for Y_n only), and the other uses conditional Monte Carlo, conditioning upon Y_1, ..., Y_{n-1}. Properties of the relative error of the estimators are derived and a numerical study given in terms of the M/G/1 queue in which n is replaced by an independent geometric random variable N. The conclusion is that the new estimators compare extremely favorably with previous ones. In particular, the conditional Monte Carlo estimator is the first heavy-tailed example of an estimator with bounded relative error. Further improvements are obtained in the random-N case by incorporating control variates and stratification techniques into the new estimation procedures.
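The conditional Monte Carlo idea can be sketched as follows: given Y_1, ..., Y_{n-1}, the event {S_n > u, M_n = Y_n} reduces to Y_n exceeding max(M_{n-1}, u - S_{n-1}), so Y_n can be integrated out analytically via its tail function. This is a minimal sketch (not the paper's code), checked against light-tailed Exp(1) summands where the exact answer is the Gamma(n, 1) tail; the paper's interest is the heavy-tailed case:

```python
import math
import random

def cond_mc_estimator(n, u, tail, sampler, reps, rng):
    """Conditional Monte Carlo estimate of P(S_n > u) via the identity
    P(S_n > u) = n * P(S_n > u, M_n = Y_n): sample Y_1..Y_{n-1}, then
    integrate Y_n out analytically using its tail function P(Y > y)."""
    total = 0.0
    for _ in range(reps):
        ys = [sampler(rng) for _ in range(n - 1)]
        s_partial, m_partial = sum(ys), max(ys)
        # S_n > u with Y_n the maximum  <=>  Y_n > max(M_{n-1}, u - S_{n-1})
        total += tail(max(m_partial, u - s_partial))
    return n * total / reps

# Sanity check with Exp(1) summands, where P(S_n > u) is the Gamma(n, 1) tail.
rng = random.Random(12345)
n, u = 4, 10.0
est = cond_mc_estimator(
    n, u,
    tail=lambda y: math.exp(-y) if y > 0 else 1.0,
    sampler=lambda r: r.expovariate(1.0),
    reps=100_000,
    rng=rng,
)
exact = math.exp(-u) * sum(u**k / math.factorial(k) for k in range(n))
```

Because the estimator averages exact conditional tail probabilities rather than 0/1 indicators, its variance is far below that of crude Monte Carlo at the same sample size.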
Abstract:
This work has demonstrated for the first time that a single RAFT agent (i.e., difunctional) can be used in conjunction with a radical initiator to obtain a desired M_n and PDI with controlled rates of polymerization. Simulations were used not only to verify the model but also to provide a predictive tool to generate other MWDs. It was also shown that all the MWDs prepared in this work could be translated to higher molecular weights through chain-extension experiments with little or no compromise in the control of end-group functionality. The ratio of monofunctional to difunctional S=C(CH2Ph)S- end groups, XPX and XP (where X = S=C(CH2Ph)S-), can be controlled by simply changing the concentration of the initiator, AIBN. Importantly, the amount of dead polymer is extremely low and fulfils the criterion suggested by Szwarc (Nature 1956) that, to meet living requirements, nonfunctional polymeric species formed by side reactions in the process should be undetectable by analytical techniques. In addition, this novel methodology will allow AB, ABA, and statistical multiblock copolymers with predetermined ratios to be produced in a one-pot reaction.
Abstract:
Non-technical loss (NTL) identification and prediction are important tasks for many utilities. Data from a customer information system (CIS) can be used for NTL analysis. However, in order to accurately and efficiently perform NTL analysis, the original data from the CIS need to be pre-processed before any detailed NTL analysis can be carried out. In this paper, we propose a feature-selection-based method for CIS data pre-processing in order to extract the most relevant information for further analysis such as clustering and classification. By removing irrelevant and redundant features, feature selection is an essential step in the data mining process: finding an optimal subset of features improves the quality of results by giving faster processing times, higher accuracy, and simpler results with fewer features. A detailed feature selection analysis is presented in the paper. Both time-domain and load-shape data are compared based on the accuracy, consistency, and statistical dependencies between features.
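The abstract does not specify the selection criteria used; as an illustration only, a simple filter-style step that scores each feature by correlation with the label and drops features highly correlated with an already-selected one could look like this (all names and thresholds are hypothetical):

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_features(features, label, relevance_min=0.3, redundancy_max=0.9):
    """Greedy filter: rank features by |correlation with the label|, keep a
    feature only if it is relevant and not redundant with one already kept."""
    ranked = sorted(features, key=lambda f: -abs(pearson(features[f], label)))
    chosen = []
    for name in ranked:
        if abs(pearson(features[name], label)) < relevance_min:
            continue  # irrelevant to the label
        if any(abs(pearson(features[name], features[c])) > redundancy_max
               for c in chosen):
            continue  # redundant with an already-selected feature
        chosen.append(name)
    return chosen

# Synthetic check: f2 nearly duplicates f1, f3 is noise; one of f1/f2 survives.
rng = random.Random(0)
f1 = [rng.random() for _ in range(300)]
f2 = [x + 0.01 * rng.random() for x in f1]
f3 = [rng.random() for _ in range(300)]
label = [x + 0.1 * rng.random() for x in f1]
chosen = select_features({"f1": f1, "f2": f2, "f3": f3}, label)
```

This captures the two goals the abstract names, relevance and non-redundancy, in their simplest filter form; the paper itself compares richer criteria (accuracy, consistency, statistical dependencies).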
Abstract:
NASA is working on complex future missions that require cooperation between multiple satellites or rovers. To implement these systems, developers are proposing and using intelligent and autonomous systems. These autonomous missions are new to NASA, and the software development community is just learning to develop such systems. With these new systems, new verification and validation techniques must be used. Current techniques have been developed based on large monolithic systems. These techniques have worked well and reliably, but do not translate to the new autonomous systems that are highly parallel and nondeterministic.
Abstract:
Using light and electron microscopic histological and immunocytochemical techniques, we investigated the effects of the glucocorticoid dexamethasone on T cell and macrophage apoptosis in the central nervous system (CNS) and peripheral nervous system (PNS) of Lewis rats with acute experimental autoimmune encephalomyelitis (EAE) induced with myelin basic protein (MBP). A single subcutaneous injection of dexamethasone markedly augmented T cell and macrophage apoptosis in the CNS and PNS and microglial apoptosis in the CNS within 6 hours (h). Pre-embedding immunolabeling revealed that dexamethasone increased the number of apoptotic CD5+ cells (T cells or activated B cells), αβ T cells, and CD11b+ cells (macrophages/microglia) in the meninges, perivascular spaces, and CNS parenchyma. The induction of increased apoptosis was dose-dependent. Daily dexamethasone treatment suppressed the neurological signs of EAE. However, the daily injection of a dose of dexamethasone (0.25 mg/kg), which, after a single dose, did not induce increased apoptosis in the CNS or PNS, was as effective in inhibiting the neurological signs of EAE as the high dose (4 mg/kg), which induced a marked increase in apoptosis. This indicates that the beneficial clinical effect of glucocorticoid therapy in EAE does not depend on the induction of increased apoptosis. The daily administration of dexamethasone for 5 days induced a relapse that commenced 5 days after cessation of treatment, with the severity of the relapse tending to increase with dexamethasone dosage.
Abstract:
Arriving in Brisbane some six years ago, I could not help being impressed by what may be prosaically described as its atmospheric amenity resources. Perhaps this in part was due to my recent experiences in major urban centres in North America, but since that time, that sparkling quality and the blue skies seem to have progressively diminished. Unfortunately, there is also objective evidence available to suggest that this apparent deterioration is not merely the result of habituation of the senses. Air pollution data for the city show trends of increasing concentrations of those very substances that have destroyed the attractiveness of major population centres elsewhere, with climates initially as salubrious. Indeed, present figures indicate that photochemical smog in unacceptably high concentrations is rapidly becoming endemic also over Brisbane. These regrettable developments should come as no surprise. The society at large has not been inclined to respond purposefully to warnings of impending environmental problems, despite the experiences and publicity from overseas and even from other cities within Australia. Nor, up to the present, have certain politicians and government officials displayed stances beyond those necessary for the maintenance of a decorum of concern. At this stage, there still exists the possibility for meaningful government action without the embarrassment of losing political favour with the electorate. To the contrary, there is every chance that such action may be turned to advantage with increased public enlightenment. It would be more than a pity to miss perhaps the final remaining opportunity: Queensland is one of the few remaining places in the world with sufficient resources to permit both rational development and high environmental quality. 
The choice appears to be one of making a relatively minor investment now for a large financial and social gain in the near future, or permitting Brisbane to degenerate gradually into just another stagnated Los Angeles or Sydney. The present monograph attempts to introduce the problem by reviewing the available research on air quality in the Brisbane area. It also tries to elucidate some seemingly obvious, but so far unapplied, management approaches. By necessity, such a broad treatment needs to make inroads into an extensive range of subject areas, from political and legal practices to public perceptions, and from scientific measurement and statistical analysis to the dynamics of air flow. Clearly, it does not pretend to be definitive in any of these fields, but it does try to emphasize those adjustable facets of the human use system of natural resources, too often neglected in favour of air pollution control technology. The crossing of disciplinary boundaries, however, needs no apology: air quality problems are ubiquitous, touching upon space, time and human interaction.
Abstract:
In recent years there has been a growing recognition that many people with drug or alcohol problems are also experiencing a range of other psychiatric and psychological problems. The presence of concurrent psychiatric or psychological problems is likely to impact on the success of treatment services. These problems vary greatly, from undetected major psychiatric illnesses that meet internationally accepted diagnostic criteria such as those outlined in the Diagnostic and Statistical Manual (DSM-IV) of the American Psychiatric Association (1994), to less defined feelings of low mood and anxiety that do not meet diagnostic criteria but nevertheless impact on an individual’s sense of wellbeing and affect their quality of life.
Abstract:
Background. Human aortic valve allografts elicit a cellular and humoral immune response. It is not clear whether this is important in promoting valve damage. We investigated the changes in morphology, cell populations, and major histocompatibility complex antigen distribution in the rat aortic valve allograft. Methods. Fresh heart valves from Lewis rats were transplanted into the abdominal aorta of DA rats. Valves from allografted, isografted, and presensitized recipient rats were examined serially with standard morphologic and immunohistochemical techniques. Results. In comparison with isografts, the allografts were infiltrated and thickened by increased numbers of CD4(+) and CD8(+) lymphocytes, macrophages, and fibroblasts. Thickening of the valve wall and leaflet and the density of the cellular infiltrate were particularly evident after presensitization. Endothelial cells were frequently absent in presensitized allografts, whereas isografts had intact endothelium. Cellular major histocompatibility complex class I and II antigens in the allograft were substantially increased. A long-term allograft showed dense fibrosis and disruption of the media with scattered persisting donor cells. Conclusions. The changes in these aortic valve allograft experiments are consistent with an allograft immune response and confirm that the response can damage aortic valve allograft tissue. (C) 1998 by The Society of Thoracic Surgeons.
Abstract:
The role of Ca2+ in the regulation of the cell cycle has been investigated mostly in studies assessing global cytosolic free Ca2+. Recent studies, however, have used unique techniques to assess Ca2+ in subcellular organelles, such as mitochondria, and in discrete regions of the cytoplasm. These studies have used advanced fluorescence digital imaging techniques and Ca2+-sensitive fluorescence probes, and/or targeting of Ca2+-sensitive proteins to intracellular organelles. The present review describes the results of some of these studies and the techniques used. The novel techniques used to measure Ca2+ in microdomains and intracellular organelles are likely to be of great use in future investigations assessing Ca2+ homeostasis during the cell cycle.
Abstract:
The reported experimental work on the systems Fe-Zn-O and Fe-Zn-Si-O in equilibrium with metallic iron is part of a wider research program that combines experimental and thermodynamic computer modeling techniques to characterize zinc/lead industrial slags and sinters in the system PbO-ZnO-SiO2-CaO-FeO-Fe2O3. Extensive experimental investigations using high-temperature equilibration and quenching techniques followed by electron probe X-ray microanalysis (EPMA) were carried out. Special experimental procedures were developed to enable accurate measurements in these ZnO-containing systems to be performed in equilibrium with metallic iron. The systems Fe-Zn-O and Fe-Zn-Si-O were experimentally investigated in equilibrium with metallic iron in the temperature ranges 900 °C to 1200 °C (1173 to 1473 K) and 1000 °C to 1350 °C (1273 to 1623 K), respectively. The liquidus surface in the system Fe-Zn-Si-O in equilibrium with metallic iron was characterized in the composition ranges 0 to 33 wt pct ZnO and 0 to 40 wt pct SiO2. The wustite (Fe,Zn)O, zincite (Zn,Fe)O, willemite (Zn,Fe)2SiO4, and fayalite (Fe,Zn)2SiO4 solid solutions in equilibrium with metallic iron were measured.
Abstract:
The majority of past and current individual-tree growth modelling methodologies have failed to characterise and incorporate structured stochastic components. Rather, they have relied on deterministic predictions or have added an unstructured random component to predictions. In particular, spatial stochastic structure has been neglected, despite being present in most applications of individual-tree growth models. Spatial stochastic structure (also called spatial dependence or spatial autocorrelation) eventuates when spatial influences such as competition and micro-site effects are not fully captured in models. Temporal stochastic structure (also called temporal dependence or temporal autocorrelation) eventuates when a sequence of measurements is taken on an individual-tree over time, and variables explaining temporal variation in these measurements are not included in the model. Nested stochastic structure eventuates when measurements are combined across sampling units and differences among the sampling units are not fully captured in the model. This review examines spatial, temporal, and nested stochastic structure and instances where each has been characterised in the forest biometry and statistical literature. Methodologies for incorporating stochastic structure in growth model estimation and prediction are described. Benefits from incorporation of stochastic structure include valid statistical inference, improved estimation efficiency, and more realistic and theoretically sound predictions. It is proposed in this review that individual-tree modelling methodologies need to characterise and include structured stochasticity. Possibilities for future research are discussed. (C) 2001 Elsevier Science B.V. All rights reserved.
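The temporal structure described above can be made concrete with a small sketch (illustrative only, not the review's own code): residuals from repeated measurements on a single tree are modeled as an AR(1) process, and the lag-1 autocorrelation of those residuals estimates the strength of the dependence that a purely deterministic model would miss:

```python
import random

def simulate_ar1(phi, sigma, n, rng):
    """Simulate AR(1) residuals e_t = phi * e_{t-1} + sigma * w_t,
    where w_t is standard Gaussian noise."""
    e, out = 0.0, []
    for _ in range(n):
        e = phi * e + sigma * rng.gauss(0.0, 1.0)
        out.append(e)
    return out

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation; for an AR(1) series it estimates phi."""
    n = len(xs)
    m = sum(xs) / n
    num = sum((xs[t] - m) * (xs[t - 1] - m) for t in range(1, n))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

rng = random.Random(42)
resid = simulate_ar1(phi=0.7, sigma=1.0, n=5000, rng=rng)
rho1 = lag1_autocorr(resid)  # should be close to phi = 0.7
```

A model that ignores this structure treats the residuals as independent, which is exactly the invalid-inference problem the review highlights; the same logic extends to spatial and nested dependence with spatial weights or group indices in place of the time lag.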
Abstract:
The performance of three different techniques for determining proton rotating-frame relaxation rates (T1pH) in charred and uncharred woods is compared. The variable contact time (VCT) experiment is shown to over-estimate T1pH, particularly for the charred samples, due to the presence of slowly cross-polarizing C-13 nuclei. The variable spin-lock (VSL), or delayed contact, experiment is shown to overcome these problems; however, care is needed in the analysis to ensure rapidly relaxing components are not overlooked. T1pH is shown to be non-uniform for both charred and uncharred wood samples; a rapidly relaxing component (T1pH = 0.46-1.07 ms) and a slowly relaxing component (T1pH = 3.58-7.49 ms) are detected in each sample. T1pH for each component generally decreases with heating temperature (degree of charring), and the proportion of the rapidly relaxing component increases. Direct T1pH determination (via H-1 detection) shows that all samples contain an even faster relaxing component (0.09-0.24 ms) that is virtually undetectable by the indirect (VCT and VSL) techniques. A new method for correcting for T1pH signal losses in spin counting experiments is developed to deal with the rapidly relaxing component detected in the VSL experiment. Implementation of this correction increased the proportion of potential C-13 CPMAS NMR signal that can be accounted for by up to 50% for the charred samples. An even greater proportion of potential signal can be accounted for if the very rapidly relaxing component detected in the direct T1pH determination is included; however, it must be kept in mind that this experiment also detects H-1 pools which may not be involved in H-1-C-13 cross-polarization. (C) 2002 Elsevier Science (USA).
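The abstract does not detail the correction method; a generic two-component version, in which the observed signal is divided by the fraction of 1H magnetization surviving the spin-lock delay given the component fractions and their T1pH values, might be sketched as follows (the functional form and all numbers are assumptions for illustration):

```python
import math

def t1rho_correction_factor(fractions, t1rho_ms, t_ms):
    """Fraction of 1H magnetization surviving a delay of t_ms for a
    multi-component exponential decay with the given T1rho(H) values (ms)."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return sum(f * math.exp(-t_ms / t1) for f, t1 in zip(fractions, t1rho_ms))

def corrected_signal(observed, fractions, t1rho_ms, t_ms):
    """Scale an observed signal back to its zero-delay (t = 0) value."""
    return observed / t1rho_correction_factor(fractions, t1rho_ms, t_ms)

# Example: 40% fast component (T1rho = 0.7 ms), 60% slow (5.0 ms), 1 ms delay.
factor = t1rho_correction_factor([0.4, 0.6], [0.7, 5.0], 1.0)
signal0 = corrected_signal(10.0, [0.4, 0.6], [0.7, 5.0], 1.0)
```

The example shows why the fast component matters: at a 1 ms delay roughly 40% of the total signal is already lost, so omitting that component from the correction substantially under-counts the potential CPMAS signal.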