30 results for Joint economic design
Abstract:
The solar array rotation mechanism provides a hinged joint between the solar panel and the satellite body, smooth rotation of the solar array into its deployed position, and fixation in that position. After the solar panel is unlocked in orbit, the rotation bracket turns towards its ready-to-work position under the action of a driving spring. During deployment, once the required operating angle (defined by the power subsystem engineer) is reached, the rotation bracket collides with a fixed bracket mounted on the satellite body, stopping the rotation. Because the collision force may impair the function of the rotation mechanism, the design of a centrifugal brake is essential. At the moment of stoppage, micro-switches activate the final position sensor and a stopper locks the rotation bracket. The design of the spring and centrifugal brake components and a static finite element stress analysis of the primary structural body of the rotation mechanism at the moment of stoppage are presented. Finally, the reliability of the rotation mechanism is evaluated. The benefit of this study is to aid the design of rotation mechanisms for micro-satellite applications.
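The abstract does not give the spring-sizing procedure; purely as background, a common first check in spring-driven deployment design is the torque margin of the driving spring against the sum of resisting torques. A minimal sketch in Python, with all values assumed for illustration only:

```python
# Illustrative torque-margin check for a spring-driven deployment hinge.
# All numbers are hypothetical; they are not from the cited study.

def torque_margin(spring_torque, resisting_torques):
    """Torque margin = available / required - 1 (common aerospace practice)."""
    required = sum(resisting_torques)
    return spring_torque / required - 1.0

# Torsion spring torque at release: T = k * theta
k = 0.05          # spring rate, N*m/rad (assumed)
theta = 1.57      # wind-up angle at release, rad (assumed, ~90 deg)
available = k * theta

resisting = [0.01, 0.015, 0.008]  # friction, harness, damper drag, N*m (assumed)

margin = torque_margin(available, resisting)
print(f"available torque: {available:.3f} N*m, margin: {margin:.2f}")
# A margin well above 1.0 (100%) is typically required at worst-case conditions.
```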
Abstract:
A joint concern with multidimensionality and dynamics is a defining feature of the pervasive use of the terminology of social exclusion in the European Union. The notion of social exclusion focuses attention on economic vulnerability in the sense of exposure to risk and uncertainty. Sociological concern with these issues has been associated with the thesis that risk and uncertainty have become more pervasive and extend substantially beyond the working class. This paper combines features of recent approaches to statistical modelling of poverty dynamics and multidimensional deprivation in order to develop our understanding of the dynamics of economic vulnerability. An analysis involving nine countries and covering the first five waves of the European Community Household Panel shows that, across nations and time, it is possible to identify an economically vulnerable class. This class is characterized by heightened risk of falling below a critical resource level, exposure to material deprivation and experience of subjective economic stress. Cross-national differentials in the persistence of vulnerability are wider than in the case of income poverty and less affected by measurement error. Economic vulnerability profiles vary across welfare regimes in a manner broadly consistent with our expectations. Variation in the impact of social class within and across countries provides no support for the argument that its role in structuring such risk has become much less important. Our findings suggest that it is possible to accept the importance of the emergence of new forms of social risk, and to acknowledge the significance of efforts to develop welfare-state policies involving a shift of opportunities and decision-making onto individuals, without accepting the 'death of social class' thesis.
Abstract:
OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.
DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.
SETTING: Primary and secondary care.
PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).
INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.
MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).
RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detectable at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000 (an illustrative ICER calculation is sketched after this abstract). The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon, which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.
CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.
FUNDING: The National Institute for Health Research Health Technology Assessment Programme.
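For readers unfamiliar with the decision rule referenced in the results above, an ICER compares two pathways by the ratio of incremental cost to incremental QALYs, after checking for dominance. A minimal sketch with invented numbers (not the study's data):

```python
# Illustrative ICER comparison between two monitoring pathways.
# Costs and QALYs below are invented for demonstration only.

def compare(cost_new, qaly_new, cost_old, qaly_old, threshold=30_000):
    """Return a verdict for the new pathway versus the comparator."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated (more costly, no more effective)"
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant (cheaper and at least as effective)"
    # (South-west quadrant cases are simplified in this sketch.)
    icer = d_cost / d_qaly
    verdict = "cost-effective" if 0 < icer <= threshold else "not cost-effective"
    return f"ICER = £{icer:,.0f}/QALY -> {verdict} at £{threshold:,}/QALY"

print(compare(cost_new=4_200, qaly_new=12.46, cost_old=3_100, qaly_old=12.43))
```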
Abstract:
Architects typically interpret Heidegger to mean that dwelling in the Black Forest was more authentic than living in an industrialised society; however, we cannot turn back the clock, so we are confronted with the reality of modernisation. Since the Second World War, production has shifted from material to immaterial assets. Increasingly, place is believed to offer resistance to this fluidity, but this belief can conversely be viewed as expressing a sublimated anxiety about our role in the world – the need to create buildings that are self-consciously contextual suggests that we may no longer be rooted in material places, but in immaterial relations.
This issue has been pondered by David Harvey in his paper From Place to Space and Back Again, where he argues that the role of place in legitimising identity is ultimately a political process, as the interpretation of its meaning depends on whose interpretation it is. Doreen Massey has found that different classes of people are more or less mobile, and that mobility is related to class and education rather than to nationality or geography. These thinkers point to a different set of questions than the usual space/place divide – how can we begin to address the economic mediation of spatial production in order to develop an ethical production of place? Part of the answer is provided by the French architectural practice Lacaton Vassal in their book Plus. They ask themselves how to produce more space for the same cost so that people can enjoy a better quality of life. Another French practitioner, Patrick Bouchain, has argued that architects' fees should be inversely proportional to the amount of material resources that they consume. These approaches use economics as a starting point for generating architectural form and point to more ethical possibilities for architectural practice.
Abstract:
We consider a multiple-femtocell deployment in a small area which shares spectrum with the underlaid macrocell. We design a joint energy and radio spectrum scheme which aims not only at co-existence with the macrocell, but also at an energy-efficient implementation of the multi-femtocells. In particular, aggregate energy usage on dense femtocell channels is formulated taking into account the cost of both spectrum and energy usage. We investigate an energy- and spectral-efficient approach to balance the two costs by varying the number of active sub-channels and their energy. The proposed scheme is addressed by deriving closed-form expressions for the interference towards the macrocell and the outage capacity. Analytically, discrete regions are introduced under which the most promising outage capacity is achieved by the same number of active sub-channels. Through a joint optimization of the sub-channels and their energy, properties can be found for the maximum outage capacity under realistic constraints. Using asymptotic and numerical analysis, it can be seen that in a dense femtocell deployment, the optimum utilization of the energy and the spectrum to maximize the outage capacity converges towards a round-robin scheduling approach for a very small outage threshold. This is the inverse of the traditional greedy approach. © 2012 IEEE.
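The paper's closed-form expressions are not reproduced in the abstract; simply to illustrate the round-robin versus greedy contrast, a Monte Carlo sketch can compare an equal power split across all sub-channels with concentrating the whole budget on the strongest one, under an assumed Rayleigh-fading model with arbitrary parameters:

```python
# Illustrative comparison of round-robin (equal power over all sub-channels)
# versus greedy (all power on the best sub-channel) outage behaviour.
# Channel model, power budget and threshold are assumptions, not the paper's.
import random, math

def trial(n_sub, power, rate_threshold):
    gains = [random.expovariate(1.0) for _ in range(n_sub)]  # Rayleigh power gains
    # Round-robin: split power equally, sum the per-subchannel rates.
    rr_rate = sum(math.log2(1 + power / n_sub * g) for g in gains)
    # Greedy: put the entire budget on the strongest sub-channel.
    greedy_rate = math.log2(1 + power * max(gains))
    return rr_rate < rate_threshold, greedy_rate < rate_threshold

random.seed(0)
trials = 100_000
rr_out = greedy_out = 0
for _ in range(trials):
    a, b = trial(n_sub=8, power=4.0, rate_threshold=2.0)
    rr_out += a
    greedy_out += b
print(f"round-robin outage: {rr_out/trials:.4f}, greedy outage: {greedy_out/trials:.4f}")
```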
Abstract:
Waste management and sustainability are two core underlying philosophies that the construction sector must acknowledge and implement; however, this can prove difficult and time consuming. To this end, the aim of this paper is to examine waste management strategies and the possible benefits, advantages and disadvantages of their introduction and use, and to examine any inter-relationship with sustainability, particularly at the design stage. The purpose of this paper is to gather, examine and review published works and to investigate the factors which influence economic decisions at the design phase of a construction project. In addressing this aim, a three-tiered sequential research approach is adopted: in-depth literature review, interviews/focus groups and qualitative analysis. The resulting data are analyzed and discussed, and potential conclusions identified, paying particular attention to implications for practice within architectural firms. This research is of importance, particularly to the architectural sector, as it can add to the industry's understanding of the design process, while also considering the application and integration of waste management into the design procedure. Results indicate that the researched strategies had many advantages but also inherent disadvantages. It was found that the potential advantages outweighed the disadvantages, but uptake within industry was still slow, and that better promotion of these strategies and of their benefits to sustainability, the environment, society and the industry was required.
Abstract:
Best concrete research paper by a student - Research has shown that the cost of managing structures puts a high strain on the infrastructure budget, with estimates of over 50% of the European construction budget being dedicated to repair and maintenance. If reinforced concrete structures are not suitably designed and adequately maintained, their service life is compromised and the full economic value of the investment is not realised. The issue is more prevalent in coastal structures as a result of combinations of aggressive actions, such as those caused by chlorides, sulphates and cyclic freezing and thawing.

It is common practice nowadays to ensure the durability of reinforced concrete structures by specifying a concrete mix and a nominal cover at the design stage to cater for the exposure environment. In theory, this should produce the performance required to achieve a specified service life. Although the European Standard EN 206-1 specifies variations in the exposure environment, it does not take into account the macro and micro climates surrounding structures, which have a significant influence on their performance and service life. Therefore, in order to construct structures which will perform satisfactorily in different exposure environments, two aspects need to be developed: a performance-based specification to supplement EN 206-1, outlining the expected performance of the structure in a given environment; and a simple yet transferable procedure for assessing the performance of structures in service, termed KPI Theory. This will allow asset managers not only to design structures for the intended service life, but also to take informed maintenance decisions should the in-service performance fall short of what was specified. This paper aims to discuss this further.
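The abstract does not detail a service-life model; as background only, chloride-driven service life of coastal concrete is often screened with the error-function solution of Fick's second law, C(x,t) = Cs·(1 − erf(x/(2·sqrt(D·t)))), solving for the time at which the concentration at the cover depth reaches a critical threshold. A minimal sketch with assumed parameters:

```python
# Illustrative chloride-ingress screening via Fick's second law (erf solution).
# Material parameters below are assumed, not taken from the paper.
import math

def chloride_at_depth(x_m, t_s, c_surface, diffusivity_m2s):
    """C(x,t) = Cs * (1 - erf(x / (2*sqrt(D*t))))."""
    return c_surface * (1.0 - math.erf(x_m / (2.0 * math.sqrt(diffusivity_m2s * t_s))))

YEAR = 365.25 * 24 * 3600.0
cover = 0.050   # rebar cover, m (assumed)
Cs = 0.40       # surface chloride, % by mass of binder (assumed)
D = 5e-13       # apparent diffusivity, m^2/s (assumed)
C_crit = 0.05   # corrosion-initiation threshold (assumed)

# Find the first year at which the cover-depth concentration exceeds C_crit.
for year in range(1, 201):
    if chloride_at_depth(cover, year * YEAR, Cs, D) >= C_crit:
        print(f"estimated initiation after ~{year} years")
        break
else:
    print("threshold not reached within 200 years")
```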
Abstract:
The initial composition of acrylic bone cement, along with the mixing and delivery technique used, can influence its final properties and therefore its clinical success in vivo. The polymerisation of acrylic bone cement is complex, with a number of processes happening simultaneously. Acrylic bone cement mixing and delivery systems have undergone several design changes in their advancement, although the cement constituents themselves have remained unchanged since they were first used. This study was conducted to determine the factors that have the greatest effect on the final properties of acrylic bone cement using a pre-filled bone cement mixing and delivery system. A design of experiments (DoE) approach was used to determine the impact of the factors associated with this mixing and delivery method on the final properties of the cement produced. The DoE illustrated that all factors present within this study had a significant impact on the final properties of the cement. An optimum cement composition was hypothesised and tested. This optimum recipe produced cement with final mechanical and thermal properties within the clinical guidelines stated by ISO 5833 (International Standard Organisation (ISO), International standard 5833: implants for surgery—acrylic resin cements, 2002); however, the low setting times observed would not be clinically viable and could result in complications during surgery. As a result, further development would be required to improve the setting time of the cement for it to be deemed suitable for use in total joint replacement surgery.
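The factors and levels of the study's DoE are not listed in the abstract; as a generic illustration of the approach, a two-level full factorial with main-effect estimation could look like the following, where the factor names and responses are hypothetical:

```python
# Illustrative two-level full-factorial DoE with main-effect estimation.
# Factor names, levels and responses are hypothetical, not the study's data.
from itertools import product

factors = ["powder_liquid_ratio", "mixing_speed", "pre_chill"]
runs = list(product([-1, +1], repeat=len(factors)))   # 2^3 = 8 coded runs

# Hypothetical measured response for each run (e.g. setting time in minutes).
response = [11.2, 9.8, 12.1, 10.4, 10.6, 9.1, 11.5, 9.9]

# Main effect of a factor: mean response at +1 minus mean response at -1.
for j, name in enumerate(factors):
    hi = [y for run, y in zip(runs, response) if run[j] == +1]
    lo = [y for run, y in zip(runs, response) if run[j] == -1]
    effect = sum(hi) / len(hi) - sum(lo) / len(lo)
    print(f"{name:>20}: main effect = {effect:+.2f}")
```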
Abstract:
Environmental problems, especially climate change, have become a serious global issue waiting to be solved. In the construction industry, the concept of sustainable building is developing to reduce greenhouse gas emissions. In this study, a building information modeling (BIM) based building design optimization method is proposed to help designers optimize their designs and improve buildings' sustainability. A revised particle swarm optimization (PSO) algorithm is applied to search for the trade-off between the life cycle costs (LCC) and life cycle carbon emissions (LCCE) of building designs. In order to validate the effectiveness and efficiency of this method, a case study of an office building is conducted in Hong Kong. The result of the case study shows that this method can enlarge the search space for optimal design solutions and shorten the processing time for optimal design results, helping designers deliver an economical and environmentally friendly design scheme.
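The revised PSO itself is not specified in the abstract; as a generic illustration of searching a cost/carbon trade-off, a textbook PSO minimising a weighted sum of hypothetical LCC and LCCE surfaces over two normalised design variables might look like this:

```python
# Illustrative textbook PSO minimising a weighted LCC/LCCE objective.
# The objective and design variables are invented stand-ins, not the paper's model.
import random

def objective(x):
    wall, glazing = x
    lcc = (wall - 0.3) ** 2 + 0.5 * (glazing - 0.4) ** 2   # hypothetical cost surface
    lcce = 0.8 * (wall - 0.5) ** 2 + (glazing - 0.2) ** 2  # hypothetical carbon surface
    return 0.5 * lcc + 0.5 * lcce                          # equal-weight trade-off

random.seed(1)
n, dims, w, c1, c2 = 20, 2, 0.7, 1.5, 1.5
pos = [[random.random() for _ in range(dims)] for _ in range(n)]
vel = [[0.0] * dims for _ in range(n)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=objective)

for _ in range(100):
    for i in range(n):
        for d in range(dims):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))  # keep in bounds
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=objective)

print(f"best design: {gbest}, objective: {objective(gbest):.4f}")
```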
Abstract:
Demand Side Management (DSM) plays an important role in the Smart Grid. It involves large-scale access points, massive numbers of users, heterogeneous infrastructure and dispersed participants. Cloud computing is a service model characterized by on-demand resources, high reliability and large-scale integration, and game theory is a useful tool for analysing dynamic economic phenomena. In this study, a 'cloud + end' scheme is designed to solve the technical and economic problems of DSM. The 'cloud + end' architecture addresses the technical problems of DSM. In particular, a construction model of 'cloud + end' is presented to solve the economic problems of DSM based on game theory. The proposed method is tested on the construction of a DSM 'cloud + end' public service system in a city in southern China. The results demonstrate the feasibility of these integrated solutions, which can provide a reference for the popularization and application of DSM in China.
Abstract:
To value something, you first have to know what it is. Bartkowski et al. (2015) reveal a critical weakness: that biodiversity has rarely, if ever, been defined in economic valuations of putative biodiversity. Here we argue that a precise definition is available and could help focus valuation studies, but that in using this scientific definition (a three-dimensional measure of total difference), valuation by stated-preference methods becomes, at best, very difficult. We reclassify the valuation studies reviewed by Bartkowski et al. (2015) to better reflect the biological definition of biodiversity and its potential indirect use value as the support for provisioning and regulating services. Our analysis shows that almost all of the studies reviewed by Bartkowski et al. (2015) were not about biodiversity, but rather about the 'vague notion' of naturalness, or sometimes a specific biological component of diversity. Alternative economic methods should be found to value biodiversity as it is defined in natural science. We suggest options based on a production function analogy or cost-based methods. The first of these, in particular, provides a strong link between economic theory and ecological research and is empirically practical. Since applied science emphasizes a scientific definition of biodiversity in the design and justification of conservation plans, the need for economic valuation of this quantitative meaning of biodiversity is considerable and as yet unfulfilled.
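The production-function analogy is only named in the abstract; one hedged way to picture it is a Cobb-Douglas form V = A·S^beta, whose elasticity beta can be estimated by log-log regression. A sketch on synthetic data (all values invented):

```python
# Illustrative production-function estimate: service output vs. biodiversity.
# Data are synthetic; in the Cobb-Douglas analogy V = A * S^beta,
# beta is the elasticity of service value w.r.t. a diversity measure S.
import math, random

random.seed(42)
true_A, true_beta = 2.0, 0.35
data = []
for _ in range(200):
    s = random.uniform(5, 100)            # biodiversity index (synthetic)
    noise = math.exp(random.gauss(0, 0.1))
    v = true_A * s ** true_beta * noise   # service value (synthetic)
    data.append((math.log(s), math.log(v)))

# Ordinary least squares on log-transformed data: log V = log A + beta * log S.
n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
beta = sum((x - mx) * (y - my) for x, y in data) / sum((x - mx) ** 2 for x, _ in data)
log_A = my - beta * mx
print(f"estimated beta: {beta:.3f}, estimated A: {math.exp(log_A):.3f}")
```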
Abstract:
The conversion of biomass for the production of liquid fuels can help reduce the greenhouse gas (GHG) emissions that are predominantly generated by the combustion of fossil fuels. Oxymethylene ethers (OMEs) are a series of liquid fuel additives that can be obtained from syngas, which is produced from the gasification of biomass. The blending of OMEs in conventional diesel fuel can reduce soot formation during combustion in a diesel engine. In this research, a process for the production of OMEs from woody biomass has been simulated. The process consists of several unit operations including biomass gasification, syngas cleanup, methanol production, and conversion of methanol to OMEs. The methodology involved the development of process models, the identification of the key process parameters affecting OME production based on the process model, and the development of an optimal process design for high OME yields. It was found that up to 9.02 tonnes day⁻¹ of OME3, OME4, and OME5 (which are suitable as diesel additives) can be produced from 277.3 tonnes day⁻¹ of wet woody biomass. Furthermore, an optimal combination of the parameters, which was generated from the developed model, can greatly enhance OME production and thermodynamic efficiency. This model can further be used in a techno-economic assessment of the whole biomass conversion chain to produce OMEs. The results of this study can be helpful for petroleum-based fuel producers and policy makers in determining the most attractive pathways of converting bio-resources into liquid fuels.
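From the throughput figures quoted above, the overall mass yield can be checked directly; the sketch below merely restates that arithmetic (the abstract does not give a per-product split):

```python
# Overall mass yield of OME3-5 from wet woody biomass, using the
# throughput figures quoted in the abstract.
ome_out = 9.02      # tonnes/day of OME3, OME4 and OME5 combined
biomass_in = 277.3  # tonnes/day of wet woody biomass
print(f"overall mass yield: {ome_out / biomass_in * 100:.2f}%")  # ~3.25%
```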
Abstract:
Laser transmission joining (LTJ) is growing in importance and has the potential to become a niche technique for the fabrication of hybrid plastic-metal joints for medical device applications. The possibility of directly joining plastics to metals by LTJ has been demonstrated by a number of recent studies. However, a reliable and quantitative method for defining the contact area between the plastic and the metal, facilitating calculation of the mechanical shear stress of the hybrid joints, is still lacking. A new method, based on image analysis using ImageJ, is proposed here to quantify the contact area at the joint interface. The effect of discolouration on the mechanical performance of the hybrid joints is also reported for the first time. Biocompatible polyethylene terephthalate (PET) and commercially pure titanium (Ti) were selected as materials for laser joining using a 200 W CW fibre laser system. The effects of laser power, scanning speed and stand-off distance between the nozzle tip and the top surface of the plastic were studied and analysed using a Taguchi L9 orthogonal array and ANOVA, respectively. The surface morphology, structure and elemental composition of the PET and Ti surfaces after shearing/peeling apart were characterized by SEM, EDX, XRD and XPS.
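The study's factor levels and responses are not given in the abstract; as a generic illustration of a Taguchi L9 analysis, the standard L9(3^4) array with a larger-is-better signal-to-noise ratio can be evaluated as follows, with hypothetical joint strengths:

```python
# Illustrative Taguchi L9 analysis for three factors at three levels.
# Factor levels and the joint-strength responses are hypothetical.
import math

# Standard L9(3^4) orthogonal array; first three columns used here.
L9 = [(1,1,1),(1,2,2),(1,3,3),
      (2,1,2),(2,2,3),(2,3,1),
      (3,1,3),(3,2,1),(3,3,2)]
factors = ["laser_power", "scan_speed", "stand_off"]

# Hypothetical shear-strength response (MPa) for each of the nine runs.
y = [8.2, 9.1, 7.5, 10.3, 9.8, 8.9, 9.5, 10.1, 8.4]

# Larger-is-better signal-to-noise ratio per run: -10*log10(1/y^2).
sn = [-10 * math.log10(1 / yi ** 2) for yi in y]

# Mean S/N per factor level shows which setting is most robust.
for j, name in enumerate(factors):
    means = []
    for level in (1, 2, 3):
        vals = [s for row, s in zip(L9, sn) if row[j] == level]
        means.append(sum(vals) / len(vals))
    best = means.index(max(means)) + 1
    print(f"{name:>12}: level means {['%.2f' % m for m in means]}, best level {best}")
```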
Abstract:
In this paper, we consider the secure beamforming design for an underlay cognitive radio multiple-input single-output broadcast channel in the presence of multiple passive eavesdroppers. Our goal is to design a jamming noise (JN) transmit strategy to maximize the secrecy rate of the secondary system. By utilizing the zero-forcing method to eliminate the interference caused by the JN to the secondary user, we study the joint optimization of the information and JN beamforming for secrecy rate maximization of the secondary system while satisfying all the interference power constraints at the primary users, as well as the per-antenna power constraint at the secondary transmitter. For an optimal beamforming design, the original problem is a nonconvex program, which can be reformulated as a convex program by applying the rank relaxation method. To this end, we prove that the rank relaxation is tight and propose a barrier interior-point method to solve the resulting saddle point problem based on a duality result. To find the global optimal solution, we transform the considered problem into an unconstrained optimization problem. We then employ the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method to solve the resulting unconstrained problem, which helps reduce the complexity significantly compared to conventional methods. Simulation results show the fast convergence of the proposed algorithm and substantial performance improvements over existing approaches.
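The unconstrained reformulation is not reproduced in the abstract; purely to illustrate the final step, applying BFGS to a stand-in smooth objective via SciPy (an assumed tooling choice, not the authors' code) looks like:

```python
# Illustrative use of BFGS on a stand-in unconstrained objective.
# The function below is a toy surrogate, not the paper's secrecy-rate problem.
import numpy as np
from scipy.optimize import minimize

def objective(w):
    # Toy smooth objective over beamformer-like real parameters:
    # a negative "rate" term plus a quadratic power penalty.
    return -np.log1p((w @ np.array([1.0, 0.5, 0.2])) ** 2) + 0.1 * w @ w

w0 = np.zeros(3)
result = minimize(objective, w0, method="BFGS")
print("optimal w:", result.x, "objective:", result.fun)
```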