942 results for Auto-Regressive and Moving-Average Model with exogenous inputs
Abstract:
In 2011, the ATLAS experiment at the Large Hadron Collider recorded a data set of 4.7 inverse femtobarns at a centre-of-mass energy of 7 TeV. Part of the extensive physics programme of the ATLAS experiment is the search for physics beyond the Standard Model. Supersymmetry, a new symmetry between bosons and fermions, is regarded as the most promising candidate for new physics, and numerous direct and indirect searches for supersymmetry have been carried out over the past decades. In the following work, a direct search for supersymmetry is performed in final states with jets, missing transverse energy, and exactly one electron or muon. The analysed data set of 4.7 inverse femtobarns comprises the full amount of data recorded by the ATLAS experiment at a centre-of-mass energy of 7 TeV. The results of the analysis are combined with several other leptonic search channels in order to maximize the sensitivity to various supersymmetric production and decay modes. The measured data are compatible with the Standard Model expectation, and new exclusion limits in various supersymmetric models are computed.
Abstract:
A triple cell co-culture model was recently established by the authors, consisting of either A549 or 16HBE14o- epithelial cells combined with human blood monocyte-derived macrophages and dendritic cells, which offers the possibility to study the interaction of xenobiotics with those cells. The 16HBE14o- containing co-culture model mimics the airway epithelial barrier, whereas the A549 co-cultures mimic the alveolar type II-like epithelial barrier. The goal of the present work was to establish a new triple cell co-culture model composed of primary alveolar type I-like cells isolated from human lung biopsies (hAEpC), representing a more realistic alveolar epithelial barrier, since type I epithelial cells cover >93% of the alveolar surface. Monocultures of A549 and 16HBE14o- were morphologically and functionally compared with hAEpC using laser scanning microscopy and transmission electron microscopy, and by determining epithelial integrity. The triple cell co-cultures were characterized using the same methods. The epithelial integrity of hAEpC (mean ± SD, 1180 ± 188 Ω·cm²) was higher than that of A549 (172 ± 59 Ω·cm²) but similar to that of 16HBE14o- cells (1469 ± 156 Ω·cm²). The triple cell co-culture model with hAEpC (1113 ± 30 Ω·cm²) showed the highest integrity compared with the models based on A549 (93 ± 14 Ω·cm²) and 16HBE14o- (558 ± 267 Ω·cm²). The tight junction protein zonula occludens-1 was expressed more regularly in hAEpC and 16HBE14o- cells than in A549 cells. The epithelial alveolar model combining hAEpC with two immune cell types (i.e. macrophages and dendritic cells) offers a novel and more realistic cell co-culture system to study possible cell interactions of inhaled xenobiotics and their toxic potential on the human alveolar type I epithelial wall.
Abstract:
This paper uses a Contingent Valuation Method survey of a random sample of residents to estimate that households are willing to pay an average of $12.00 per month for public projects designed to improve river access, and $10.46 per month for additional safety measures that would eliminate risks to local watersheds from drilling for natural gas in underground shale formations. These estimates can be compared with the costs of providing each of these two amenities to help inform efficient policy decisions.
Abstract:
Background—Pathology studies on fatal cases of very late stent thrombosis have described incomplete neointimal coverage as common substrate, in some cases appearing at side-branch struts. Intravascular ultrasound studies have described the association between incomplete stent apposition (ISA) and stent thrombosis, but the mechanism explaining this association remains unclear. Whether the neointimal coverage of nonapposed side-branch and ISA struts is delayed with respect to well-apposed struts is unknown. Methods and Results—Optical coherence tomography studies from 178 stents implanted in 99 patients from 2 randomized trials were analyzed at 9 to 13 months of follow-up. The sample included 38 sirolimus-eluting, 33 biolimus-eluting, 57 everolimus-eluting, and 50 zotarolimus-eluting stents. Optical coherence tomography coverage of nonapposed side-branch and ISA struts was compared with well-apposed struts of the same stent by statistical pooled analysis with a random-effects model. A total of 34,120 struts were analyzed. The risk ratio of delayed coverage was 9.00 (95% confidence interval, 6.58 to 12.32) for nonapposed side-branch versus well-apposed struts, 9.10 (95% confidence interval, 7.34 to 11.28) for ISA versus well-apposed struts, and 1.73 (95% confidence interval, 1.34 to 2.23) for ISA versus nonapposed side-branch struts. Heterogeneity of the effect was observed in the comparison of ISA versus well-apposed struts (H=1.27; I²=38.40) but not in the other comparisons. Conclusions—Coverage of ISA and nonapposed side-branch struts is delayed with respect to well-apposed struts in drug-eluting stents, as assessed by optical coherence tomography.
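The pooled risk-ratio comparisons above can be reproduced in miniature. The sketch below computes a risk ratio with a Wald confidence interval on the log scale; the strut counts are invented for illustration and are not taken from the study.

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of event proportions (a/n1) / (b/n2) with a Wald
    95% confidence interval computed on the log scale."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 90/300 uncovered ISA struts vs 100/10000 well-apposed
rr, lo, hi = risk_ratio_ci(90, 300, 100, 10000)
```

A confidence interval excluding 1 would indicate a statistically significant delay in coverage, the same logic as the intervals reported above.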
Abstract:
This Letter presents the first search for supersymmetry in final states containing one isolated electron or muon, jets, and missing transverse momentum from √s = 7 TeV proton-proton collisions at the LHC. The data were recorded by the ATLAS experiment during 2010 and correspond to a total integrated luminosity of 35 pb⁻¹. No excess above the standard model background expectation is observed. Limits are set on the parameters of the minimal supergravity framework, extending previous limits. Within this framework, for A₀ = 0 GeV, tan β = 3, and μ > 0 and for equal squark and gluino masses, gluino masses below 700 GeV are excluded at 95% confidence level.
Abstract:
In the business literature, conflicts among workers, shareholders and management have been studied mostly within the frame of stakeholder theory. Stakeholder theory recognizes this issue as an agency problem and tries to solve it by establishing a contractual relationship between the agent and the principals. However, as Marcoux pointed out, the appropriateness of the contract as a medium for reducing the agency problem should be questioned. As an alternative, the cooperative model minimizes agency costs by integrating the roles of workers, owners and management. Mondragon Corporation is a successful example of the cooperative model, having grown into the sixth largest corporation in Spain. However, the cooperative model has long been ignored in discussions of corporate governance, mainly because its success is extremely difficult to duplicate in practice. This thesis hopes to revitalize the scholarly examination of cooperatives by developing a new model that overcomes the fundamental problem of the cooperative model: limited access to capital markets. By dividing the ownership interest into a financial interest and a control interest, the dual ownership structure allows cooperatives to issue stock in the capital market by turning the financial interest into a tradable financial product.
Abstract:
Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. Linear functions of the resulting "rotated" residuals are used to construct an empirical cumulative distribution function (ECDF), whose stochastic limit is characterized. We describe a resampling technique that serves as a computationally efficient parametric bootstrap for generating representatives of the stochastic limit of the ECDF. Through functionals, such representatives are used to construct global tests for the hypothesis of normal marginal errors. In addition, we demonstrate that the ECDF of the predicted random effects, as described by Lange and Ryan (1989), can be formulated as a special case of our approach. Thus, our method supports both omnibus and directed tests. Our method works well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series).
Abstract:
Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. The resulting "rotated" residuals are used to construct an empirical cumulative distribution function and pointwise standard errors. The theoretical framework, including conditions and asymptotic properties, involves technical details that are motivated by Lange and Ryan (1989), Pierce (1982), and Randles (1982). Our method appears to work well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series). Our methods can produce satisfactory results even for models that do not satisfy all of the technical conditions stated in our theory.
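The Cholesky rotation described in both abstracts is easy to sketch. Assuming for illustration a known marginal covariance Σ (in practice it is estimated from the fitted model), multiplying the residual vector by the transpose of the Cholesky factor of Σ⁻¹ yields approximately iid standard normal residuals under a correctly specified model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy marginal covariance: AR(1)-type correlation among n observations
n = 500
idx = np.arange(n)
Sigma = 0.6 ** np.abs(np.subtract.outer(idx, idx))

# One marginal residual vector r ~ N(0, Sigma)
r = np.linalg.cholesky(Sigma) @ rng.standard_normal(n)

# Rotate: with Sigma^{-1} = L L^T (L lower triangular), Var(L^T r) = I,
# so the rotated residuals behave like iid N(0, 1) under the true model
L = np.linalg.cholesky(np.linalg.inv(Sigma))
r_rot = L.T @ r
```

The ECDF of `r_rot` can then be plotted against the standard normal CDF, which is exactly the kind of graphical display of model fit the papers propose.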
Abstract:
Generalized linear mixed models with semiparametric random effects are useful in a wide variety of Bayesian applications. When the random effects arise from a mixture of Dirichlet process (MDP) model, normal base measures and Gibbs sampling procedures based on the Pólya urn scheme are often used to simulate posterior draws. These algorithms are applicable in the conjugate case when (for a normal base measure) the likelihood is normal. In the non-conjugate case, the algorithms proposed by MacEachern and Müller (1998) and Neal (2000) are often applied to generate posterior samples. Some common problems associated with simulation algorithms for non-conjugate MDP models include convergence and mixing difficulties. This paper proposes an algorithm based on the Pólya urn scheme that extends the Gibbs sampling algorithms to non-conjugate models with normal base measures and exponential family likelihoods. The algorithm proceeds by making Laplace approximations to the likelihood function, thereby reducing the procedure to that of conjugate normal MDP models. To ensure the validity of the stationary distribution in the non-conjugate case, the proposals are accepted or rejected by a Metropolis-Hastings step. In the special case where the data are normally distributed, the algorithm is identical to the Gibbs sampler.
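The Laplace step at the heart of the algorithm can be illustrated for a Poisson likelihood (a simplified stand-alone example, not the paper's full MDP sampler). Writing θ for the log rate, the log-likelihood ℓ(θ) = (Σyᵢ)θ − n·exp(θ) has its mode at θ̂ = log(ȳ), and the negative curvature at the mode is Σyᵢ, giving the normal approximation N(θ̂, 1/Σyᵢ):

```python
import numpy as np

def laplace_poisson(y):
    """Normal (Laplace) approximation to the Poisson log-likelihood in
    theta = log(rate): mode at log(mean(y)), variance 1/sum(y)."""
    y = np.asarray(y, dtype=float)
    theta_hat = np.log(y.mean())   # solves dl/dtheta = sum(y) - n*exp(theta) = 0
    var_hat = 1.0 / y.sum()        # inverse of -d2l/dtheta2 evaluated at the mode
    return theta_hat, var_hat

theta_hat, var_hat = laplace_poisson([4, 6, 5, 7, 3])
```

Once the likelihood is replaced by this normal approximation, the conjugate machinery for a normal base measure applies, with a Metropolis-Hastings correction to account for the approximation error, as described above.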
Abstract:
The demand for power and the associated costs of generation from non-renewable resources are increasing at an alarming rate. Solar energy is one renewable resource with the potential to slow this increase. To date, the utilization of solar energy has concentrated mainly on heating applications. Using solar energy for cooling systems in buildings would contribute greatly to the goal of minimizing non-renewable energy consumption. Approaches from solar heating research conducted by institutions such as the University of Wisconsin-Madison, and from the building heat flow model research conducted by Oklahoma State University, can be used to develop and optimize solar cooling building systems. This research uses both approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source to drive the cycle. The software was then put through a series of validation tests to verify its integrity, conducted on building cooling system data sets from similar applications around the world. The output of the developed software agreed with the established experimental results from those data sets. Software developed by other research groups caters to advanced users; the software developed in this research is not only reliable in its code integrity but also, through its integrated approach, accessible to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
Abstract:
To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States (U.S.) is pursuing several options for creating biofuels from renewable woody biomass (hereafter referred to as "biomass"). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization, and to tap unused forest residues, it is proposed to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass, based on a set of evaluation criteria such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce availability. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, locations, and sizes of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to simultaneously minimize the weighted sum of the delivered feedstock cost, energy consumption, and GHG emissions. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for a biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.
Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research. Location analysis is critical to the financial success of producing biofuel. The modeling of woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level. Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not identical, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and through presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
Abstract:
A range of societal issues has been caused by fossil fuel consumption in the transportation sector in the United States (U.S.), including health-related air pollution, climate change, dependence on imported oil, and other oil-related national security concerns. Biofuels produced from various lignocellulosic biomass types, such as wood, forest residues, and agricultural residues, have the potential to replace a substantial portion of total fossil fuel consumption. This research focuses on locating biofuel facilities and designing the biofuel supply chain to minimize the overall cost. For this purpose, an integrated methodology was proposed that combines GIS technology with simulation and optimization modeling methods. As a precursor to simulation and optimization modeling, the GIS-based methodology was used to preselect potential biofuel facility locations for biofuel production from forest biomass. Candidate locations were selected based on a set of evaluation criteria, including: county boundaries, the railroad transportation network, the state/federal road transportation network, water body (rivers, lakes, etc.) dispersion, city and village dispersion, population census data, biomass production, and no co-location with co-fired power plants. The resulting candidate sites for biofuel production served as inputs for the simulation and optimization modeling. The simulation and optimization models were built around key supply activities, including biomass harvesting/forwarding, transportation, and storage. Onsite storage was built to serve the spring breakup period, when road restrictions were in place and truck transportation on certain roads was limited.
Both models were evaluated using multiple performance indicators, including cost (consisting of the delivered feedstock cost and the inventory holding cost), energy consumption, and GHG emissions. The impacts of energy consumption and GHG emissions were expressed in monetary terms to be consistent with cost. Compared with the optimization model, the simulation model provides a more dynamic look at a 20-year operation by considering the impacts associated with building inventory at the biorefinery to address the limited availability of biomass feedstock during the spring breakup period. The number of trucks required per day was estimated, and the year-round inventory level was tracked. Through the exchange of information across the different procedures (harvesting, transportation, and biomass feedstock processing), a smooth flow of biomass from harvesting areas to a biofuel facility was implemented. The optimization model was developed to address issues related to locating multiple biofuel facilities simultaneously. The size of each potential biofuel facility is bounded above by 50 MGY and below by 30 MGY. The optimization model is a static, Mathematical Programming Language (MPL)-based application that allows for sensitivity analysis by changing inputs to evaluate different scenarios. It was found that annual biofuel demand and biomass availability impact the optimal biofuel facility locations and sizes.
Abstract:
In this thesis, we consider Bayesian inference on the detection of change points in variance for models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and heavy-tailed and includes as special cases the Gaussian, Student-t, contaminated normal, and slash distributions. The proposed models provide greater flexibility for analyzing practical data, which often exhibit heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters in the variance change-point models with SMN distributions. Due to the complexity of the joint posterior distribution, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior Bayesian inference. Thereafter, following the idea of [1], we consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed through simulation studies. A real application to closing-price data from the U.S. stock market is analyzed for illustrative purposes.
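A minimal sketch of the Gaussian special case helps make the machinery concrete: a single change point τ in the variance of a zero-mean normal series, with conjugate inverse-gamma priors on the two segment variances, sampled by Gibbs. The simulated data and hyperparameters are illustrative; the SMN extensions and the Metropolis-Hastings step of the thesis are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated zero-mean series whose variance jumps from 1 to 9 at t = 120
y = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(0.0, 3.0, 80)])
n = len(y)
a0, b0 = 2.0, 1.0                 # inverse-gamma prior hyperparameters (assumed)
cs = np.cumsum(y ** 2)            # cumulative sums of squares, reused each sweep
taus = np.arange(1, n)            # candidate change points (strictly inside)

tau, s1, s2 = n // 2, 1.0, 1.0
draws = []
for it in range(2000):
    # Conjugate inverse-gamma updates for the two segment variances
    ss1, ss2 = cs[tau - 1], cs[-1] - cs[tau - 1]
    s1 = 1.0 / rng.gamma(a0 + tau / 2.0, 1.0 / (b0 + ss1 / 2.0))
    s2 = 1.0 / rng.gamma(a0 + (n - tau) / 2.0, 1.0 / (b0 + ss2 / 2.0))
    # Discrete full conditional of the change point tau
    ll = (-0.5 * taus * np.log(s1) - 0.5 * cs[taus - 1] / s1
          - 0.5 * (n - taus) * np.log(s2) - 0.5 * (cs[-1] - cs[taus - 1]) / s2)
    p = np.exp(ll - ll.max())
    tau = int(rng.choice(taus, p=p / p.sum()))
    if it >= 500:                 # discard burn-in
        draws.append(tau)

tau_hat = int(np.median(draws))   # posterior median change point
```

With a variance jump this pronounced, the posterior for τ concentrates near the true change point; replacing the normal likelihood with an SMN member is what requires the Metropolis-Hastings correction described in the abstract.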