39 results for economic statistical design


Relevance: 30.00%

Abstract:

Objectives

A P-value <0.05 is one metric used to evaluate the results of a randomized controlled trial (RCT). We wondered how often statistically significant results in RCTs would be lost with small changes in the number of outcome events.

Study Design and Setting

We reviewed RCTs in high-impact medical journals that reported a statistically significant result for at least one dichotomous or time-to-event outcome in the abstract. In the group with the smallest number of events, we changed the status of patients without an event to an event until the P-value exceeded 0.05. We labeled this number the Fragility Index; smaller numbers indicated a more fragile result.

Results

The 399 eligible trials had a median sample size of 682 patients (range: 15-112,604) and a median of 112 events (range: 8-5,142); 53% reported a P-value <0.01. The median Fragility Index was 8 (range: 0-109); 25% had a Fragility Index of 3 or less. In 53% of trials, the Fragility Index was less than the number of patients lost to follow-up.

Conclusion

The statistically significant results of many RCTs hinge on small numbers of events. The Fragility Index complements the P-value and helps identify less robust results. 
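
The Fragility Index lends itself to a direct implementation. Below is a minimal sketch, assuming the significance test is the two-sided Fisher exact test on a 2x2 table; the function name and interface are illustrative, not taken from the paper:

```python
from scipy.stats import fisher_exact

def fragility_index(events_a, n_a, events_b, n_b, alpha=0.05):
    """Convert non-events to events in the group with fewer events, one
    patient at a time, until the two-sided Fisher exact p-value exceeds
    alpha; the number of conversions made is the Fragility Index."""
    if events_a <= events_b:
        e, n, e2, n2 = events_a, n_a, events_b, n_b
    else:
        e, n, e2, n2 = events_b, n_b, events_a, n_a
    index = 0
    while e < n:
        _, p = fisher_exact([[e, n - e], [e2, n2 - e2]])
        if p > alpha:
            break
        e += 1
        index += 1
    return index

# e.g. 5/100 vs 20/100 events: significant initially, fragile after a few flips
print(fragility_index(5, 100, 20, 100))
```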

Relevance: 30.00%

Abstract:

The adulteration of extra virgin olive oil with other vegetable oils is a known problem with economic and health consequences. Current official methods have proved insufficient to detect such adulterations, and one of the most concerning and hardest to detect is the addition of hazelnut oil. The main objective of this work was to develop a novel dimensionality reduction technique able to model oil mixtures as part of an integrated pattern recognition solution. This solution aims to identify hazelnut oil adulterants in extra virgin olive oil at low percentages based on spectroscopic chemical fingerprints. The proposed Continuous Locality Preserving Projections (CLPP) technique allows the continuous nature of the in-house-produced admixtures to be modelled as data series instead of discrete points. The methodology has the potential to be extended to other mixtures and adulterations of food products. Maintaining the continuous structure of the data manifold enables better visualisation of the classification problem and facilitates a more accurate use of the manifold for detecting the adulterants.
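
CLPP itself is the paper's contribution and its continuous extension is not specified in the abstract. For orientation, here is a minimal sketch of the standard Locality Preserving Projections (LPP) algorithm it builds on, applied to spectra stored as rows of a matrix; all parameter choices are illustrative:

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, n_neighbors=5, sigma=1.0):
    """Standard LPP (He & Niyogi, 2003): a linear projection that
    preserves local neighbourhoods. X: (n_samples, n_features) spectra."""
    n = X.shape[0]
    # pairwise squared distances and heat-kernel weights on a kNN graph
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:n_neighbors + 1]   # skip self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)               # symmetrise the adjacency graph
    D = np.diag(W.sum(axis=1))
    L = D - W                            # graph Laplacian
    # generalized eigenproblem: X^T L X a = lambda X^T D X a
    A = X.T @ L @ X
    B = X.T @ D @ X
    vals, vecs = eigh(A, B + 1e-9 * np.eye(B.shape[0]))
    return vecs[:, :n_components]        # directions with smallest eigenvalues
```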

Relevance: 30.00%

Abstract:

Best concrete research paper by a student.

Research has shown that the cost of managing structures puts a high strain on the infrastructure budget, with over 50% of the European construction budget estimated to be dedicated to repair and maintenance. If reinforced concrete structures are not suitably designed and adequately maintained, their service life is compromised and the full economic value of the investment is not realised. The issue is more prevalent in coastal structures as a result of combinations of aggressive actions, such as those caused by chlorides, sulphates and cyclic freezing and thawing.

It is common practice nowadays to ensure the durability of reinforced concrete structures by specifying a concrete mix and a nominal cover at the design stage to cater for the exposure environment. In theory, this should produce the performance required to achieve a specified service life. Although the European Standard EN 206-1 specifies variations in the exposure environment, it does not take into account the macro- and microclimates surrounding structures, which have a significant influence on their performance and service life. Therefore, in order to construct structures which will perform satisfactorily in different exposure environments, two aspects need to be developed: a performance-based specification to supplement EN 206-1, outlining the expected performance of the structure in a given environment; and a simple yet transferable procedure for assessing the performance of structures in service, termed KPI Theory. This will allow asset managers not only to design structures for the intended service life, but also to take informed maintenance decisions should the performance in service fall short of what was specified. This paper discusses these aspects further.
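
The abstract does not give the paper's performance model, but the kind of service-life calculation it alludes to is commonly based on the error-function solution of Fick's second law for chloride ingress. A minimal sketch, with illustrative parameter values:

```python
from math import erf, sqrt

def chloride_at_cover(C_s, D, cover_mm, t_years):
    """Chloride concentration at the rebar after t years, from the
    error-function solution of Fick's second law:
        C(x, t) = C_s * (1 - erf(x / (2 * sqrt(D * t))))
    C_s: surface chloride (% by mass of binder)
    D:   apparent diffusion coefficient (mm^2 / year)"""
    return C_s * (1 - erf(cover_mm / (2 * sqrt(D * t_years))))

# Example: C_s = 0.6%, D = 10 mm^2/yr, 50 mm nominal cover; corrosion
# risk is typically assessed against a critical threshold (~0.4% binder)
for t in (10, 25, 50, 100):
    print(t, "years:", round(chloride_at_cover(0.6, 10.0, 50.0, t), 3), "%")
```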

Relevance: 30.00%

Abstract:

Several one-dimensional design methods have been used to predict the off-design performance of three modern centrifugal compressors for automotive turbocharging: a single-zone method, a two-zone method, and a more recent statistical method. The predicted results from each method are compared against empirical data taken from standard hot gas stand tests for each turbocharger. Each of the automotive turbochargers considered in this study has notably different geometry and a different application. Because the test conditions are non-adiabatic, the empirical data have been corrected for the effect of heat transfer to ensure comparability with the 1D models. Each method is evaluated for usability and accuracy in both pressure ratio and efficiency prediction. The paper presents an insight into the limitations of each of these models when applied to one-dimensional automotive turbocharger design, and proposes that a corrected single-zone modelling approach has the greatest potential for further development, whilst the statistical method could be introduced immediately into a design process where design variations are limited.
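
The abstract does not reproduce the models' details. For orientation, the core of a single-zone method is the Euler work equation combined with a slip correlation; a minimal sketch assuming Wiesner's slip factor and a supplied efficiency (a full single-zone code would predict the efficiency from internal loss models):

```python
import math

def wiesner_slip(Z, beta2_deg):
    """Wiesner (1967) slip factor for Z blades with backsweep beta2
    measured in degrees from radial."""
    return 1.0 - math.sqrt(math.cos(math.radians(beta2_deg))) / Z ** 0.7

def pressure_ratio(U2, Z, beta2_deg, phi2, eta_tt,
                   T01=298.15, cp=1005.0, gamma=1.4):
    """Total-to-total pressure ratio from Euler work with zero inlet swirl:
    c_theta2 = sigma*U2 - phi2*U2*tan(beta2);  dh0 = U2*c_theta2.
    phi2 = cm2/U2 is the exit flow coefficient."""
    sigma = wiesner_slip(Z, beta2_deg)
    ctheta2 = U2 * (sigma - phi2 * math.tan(math.radians(beta2_deg)))
    dh0 = U2 * ctheta2
    return (1.0 + eta_tt * dh0 / (cp * T01)) ** (gamma / (gamma - 1.0))

# e.g. 450 m/s tip speed, 7 blades, 30 deg backsweep, phi2 = 0.30, eta = 0.75
print(round(pressure_ratio(450.0, 7, 30.0, 0.30, 0.75), 2))   # ~2.5
```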

Relevance: 30.00%

Abstract:

Environmental problems, especially climate change, have become a serious global issue awaiting solutions. In the construction industry, the concept of sustainable building is being developed to reduce greenhouse gas emissions. In this study, a building information modeling (BIM)-based building design optimization method is proposed to help designers optimize their designs and improve buildings' sustainability. A revised particle swarm optimization (PSO) algorithm is applied to search for the trade-off between the life cycle costs (LCC) and life cycle carbon emissions (LCCE) of building designs. To validate the effectiveness and efficiency of this method, a case study of an office building in Hong Kong is conducted. The results show that the method enlarges the search space for optimal design solutions and shortens the processing time for reaching optimal design results, helping designers deliver economic and environmentally friendly design schemes.
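
The abstract does not describe the revisions the authors made to PSO. For reference, a minimal standard PSO minimiser over a toy two-variable design space looks like this; the design variables and objective below are stand-ins, not the paper's LCC/LCCE models:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(objective, bounds, n_particles=30, n_iters=200,
        w=0.7, c1=1.5, c2=1.5):
    """Standard particle swarm optimization: each particle is pulled toward
    its personal best and the swarm's global best."""
    lo, hi = np.asarray(bounds).T            # bounds: list of (lo, hi) per dim
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Hypothetical design vector: [window-to-wall ratio, insulation thickness];
# a weighted sum of stand-in LCC and LCCE surrogates as the objective
obj = lambda d: 1000 * (d[0] - 0.35) ** 2 + 800 * (d[1] - 0.12) ** 2
best, best_f = pso(obj, [(0.1, 0.9), (0.01, 0.30)])
print(best, best_f)
```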

Relevance: 30.00%

Abstract:

Demand Side Management (DSM) plays an important role in the Smart Grid. It involves large-scale access points, massive numbers of users, heterogeneous infrastructure and dispersed participants. Cloud computing is a service model characterized by on-demand resources, high reliability and large-scale integration, and game theory is a useful tool for analysing dynamic economic phenomena. In this study, a "cloud + end" scheme is designed to solve the technical and economic problems of DSM. The cloud + end architecture addresses the technical problems of DSM; in particular, a cloud + end construct model based on game theory is presented to solve the economic problems of DSM. The proposed method is tested on a DSM cloud + end public service system constructed in a city in southern China. The results demonstrate the feasibility of these integrated solutions, which can provide a reference for the popularization and application of DSM in China.
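
The abstract does not specify the construct model's game. As an illustration of the kind of demand-scheduling game commonly used on the "end" side of DSM, here is a sketch in which each user best-responds to the others' aggregate load under a quadratic hourly cost; all numbers and the cost model are hypothetical, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

H, N = 24, 5                                   # hours, users
budget = rng.uniform(8.0, 15.0, N)             # kWh each user must place per day
load = rng.uniform(0.0, 1.0, (N, H))
load *= (budget / load.sum(axis=1))[:, None]   # scale rows to each budget

def best_response(i, load, a=0.01, b=0.1):
    """User i re-spreads its budget against the others' aggregate load.
    With hourly cost a*L^2 + b*L, the optimum equalises marginal cost,
    giving a water-filling solution found here by bisection."""
    others = load.sum(axis=0) - load[i]
    lam_lo, lam_hi = 0.0, 1e3
    for _ in range(60):
        lam = 0.5 * (lam_lo + lam_hi)
        x = np.maximum(0.0, (lam - b) / (2 * a) - others)
        if x.sum() > budget[i]:
            lam_hi = lam
        else:
            lam_lo = lam
    return x

for _ in range(50):                            # iterate toward equilibrium
    for i in range(N):
        load[i] = best_response(i, load)

print(load.sum(axis=0).round(2))               # aggregate profile flattens
```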

Relevance: 30.00%

Abstract:

The worsening of process variations and the consequent increased spreads in circuit performance and power consumption hinder the satisfaction of targeted budgets and lead to yield loss. Corner-based design and design guardbands can limit yield loss, but in many cases such methods fail to capture the real effects, which may be far better than predicted, leading to increasingly pessimistic designs. The situation is even more severe in memories, which consist of substantially different individual building blocks; this further complicates accurate analysis of the impact of variations at the architecture level, leaving many potential issues uncovered and opportunities unexploited. In this paper, we develop a framework for capturing non-trivial statistical interactions among all the components of a memory/cache. The developed tool is able to find the optimal memory/cache configuration under various constraints, allowing designers to make the right choices early in the design cycle and consequently improve performance, energy, and especially yield. Our results indicate that considering the architectural interactions between the memory components allows the pessimistic access times predicted by existing techniques to be relaxed.
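
The gap between corner-stacked and statistical timing that motivates such a framework can be illustrated with a small Monte Carlo experiment; the component delays and spreads below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# A cache access traverses several components (decoder, wordline,
# bitline/sense-amp, output mux) whose delays vary with process.
means  = np.array([120.0, 80.0, 150.0, 60.0])   # ps, stand-in values
sigmas = np.array([ 12.0,  8.0,  18.0,  6.0])   # ps

# Corner-based design stacks every component's 3-sigma margin.
corner = (means + 3 * sigmas).sum()

# A statistical treatment looks at the distribution of the summed path.
samples = rng.normal(means, sigmas, (1_000_000, 4)).sum(axis=1)
stat_3sigma = np.quantile(samples, 0.99865)     # 3-sigma yield point

print(f"corner-stacked access time : {corner:7.1f} ps")
print(f"statistical 99.865% point  : {stat_3sigma:7.1f} ps")
# The statistical figure is tighter: independent variations rarely
# all hit their worst corner at once.
```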

Relevance: 30.00%

Abstract:

To value something, you first have to know what it is. Bartkowski et al. (2015) reveal a critical weakness: biodiversity has rarely, if ever, been defined in economic valuations of putative biodiversity. Here we argue that a precise definition is available and could help focus valuation studies, but that in using this scientific definition (a three-dimensional measure of total difference), valuation by stated-preference methods becomes, at best, very difficult. We reclassify the valuation studies reviewed by Bartkowski et al. (2015) to better reflect the biological definition of biodiversity and its potential indirect use value as the support for provisioning and regulating services. Our analysis shows that almost all of the studies reviewed by Bartkowski et al. (2015) were not about biodiversity, but rather about the 'vague notion' of naturalness, or sometimes about a specific biological component of diversity. Alternative economic methods should be found to value biodiversity as it is defined in natural science. We suggest options based on a production-function analogy or on cost-based methods. The first of these, in particular, provides a strong link between economic theory and ecological research and is empirically practical. Since applied science emphasizes a scientific definition of biodiversity in the design and justification of conservation plans, the need for economic valuation of this quantitative meaning of biodiversity is considerable and as yet unfulfilled.

Relevance: 30.00%

Abstract:

The conversion of biomass for the production of liquid fuels can help reduce the greenhouse gas (GHG) emissions that are predominantly generated by the combustion of fossil fuels. Oxymethylene ethers (OMEs) are a series of liquid fuel additives that can be obtained from syngas, which is produced from the gasification of biomass. Blending OMEs into conventional diesel fuel can reduce soot formation during combustion in a diesel engine. In this research, a process for the production of OMEs from woody biomass has been simulated. The process consists of several unit operations, including biomass gasification, syngas cleanup, methanol production, and conversion of methanol to OMEs. The methodology involved the development of process models, the identification of the key process parameters affecting OME production, and the development of an optimal process design for high OME yields. It was found that up to 9.02 tonnes day⁻¹ of OME3, OME4, and OME5 (which are suitable as diesel additives) can be produced from 277.3 tonnes day⁻¹ of wet woody biomass. Furthermore, an optimal combination of the parameters generated from the developed model can greatly enhance OME production and thermodynamic efficiency. The model can further be used in a techno-economic assessment of the whole biomass conversion chain for producing OMEs. The results of this study can be helpful for petroleum-based fuel producers and policy makers in determining the most attractive pathways for converting bio-resources into liquid fuels.
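
As a quick check, the overall mass yield implied by the reported figures can be computed directly; this is a back-of-the-envelope calculation, not a result from the paper:

```python
# Overall mass yield of diesel-range OMEs implied by the reported figures
ome_out = 9.02       # tonnes/day of OME3-OME5
biomass_in = 277.3   # tonnes/day of wet woody biomass
print(f"mass yield: {100 * ome_out / biomass_in:.2f}% of wet feed")
# ~3.25%; on a dry basis the yield would be higher, depending on moisture
```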