Abstract:
Implementing effective time analysis methods quickly and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure-oriented, knowledge-based approach for intelligent time analysis of aircraft assembly processes within a digital manufacturing framework. A knowledge system is developed so that design knowledge can be intelligently retrieved to carry out assembly time analysis automatically. A time estimation method based on MOST is reviewed and employed. Knowledge capture, transfer and storage within the digital manufacturing environment are discussed in detail. Configured plantypes, GUIs and functional modules are designed and developed for the automated time analysis. An exemplar study using an aircraft panel assembly from a regional jet is also presented. Although the method currently focuses on aircraft assembly, it can also be applied in other industry sectors, such as transportation, automotive and shipbuilding. The main contribution of the work is a methodology that facilitates the integration of time analysis with design and manufacturing using a digital manufacturing platform solution.
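The core arithmetic of a MOST (Maynard Operation Sequence Technique) time estimate, as employed in the abstract above, can be sketched as follows. The General Move sequence model and the TMU conversion are standard MOST conventions; the specific activity and index values below are hypothetical, not data from the paper.

```python
# Minimal sketch of a basic MOST time estimate. In the General Move sequence
# model A B G  A B P  A, each parameter carries an index value; the activity
# time in TMU is the sum of the indices multiplied by 10, and 1 TMU = 0.036 s.
# The example indices below are hypothetical.

TMU_SECONDS = 0.036  # one Time Measurement Unit in seconds

def general_move_tmu(indices):
    """Total TMU for one General Move activity (A B G A B P A)."""
    return sum(indices) * 10

# Hypothetical activity: reach for, grasp and position a panel fastener.
indices = [1, 0, 1, 1, 0, 3, 0]  # A1 B0 G1 A1 B0 P3 A0
tmu = general_move_tmu(indices)
print(tmu, round(tmu * TMU_SECONDS, 3))  # 60 TMU, 2.16 s
```

Summing such activity times over a retrieved assembly plan is what an automated analysis layer would do on top of the knowledge system.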
Abstract:
Background: The underlying pathways that drive retinal neurogenesis and synaptogenesis are still relatively poorly understood. Protein expression analysis can provide direct insight into these complex developmental processes. The aim of this study was therefore to employ proteomic analysis of the developing chick retina throughout embryonic (E) development, commencing at day E12 and continuing through E13, E17 and E19 to post-hatch (P) days 1 and 33.
Results: 2D proteomic and mass spectrometric analysis detected an average of 1514 spots per gel, with 15 spots, demonstrating either modulated or constitutive expression, identified via MS. Proteins identified included alpha- and beta-tubulin, alpha-enolase, B-creatine kinase, gamma-actin, platelet-activating factor (PAF), PREDICTED: similar to TGF-beta interacting protein 1, capping protein (actin filament muscle Z line), nucleophosmin 1 (NPM1), dimethylarginine dimethylaminohydrolase, triosephosphate isomerase, DJ1, stathmin, fatty acid binding protein 7 (FABP7/B-FABP), beta-synuclein and enhancer of rudimentary homologue.
Conclusion: This study builds upon previous proteomic investigations of retinal development and adds a unique data set to those previously reported. Based on reported bioactivity, some of the identified proteins are likely to be important for normal retinal development in the chick. Continued analysis of the dynamic protein populations present at the early stages and throughout retinal development will increase our understanding of the molecular events underpinning retinogenesis.
Abstract:
Matrix metalloproteinase-3 (MMP-3) has been proposed as an important mediator of the atherosclerotic process. The possible role of the functional -1612 5A/6A polymorphism of the MMP-3 gene in the susceptibility to ischaemic heart disease (IHD) was investigated in a well-defined Irish population using two recently described family-based tests of association. One thousand and twelve individuals from 386 families with at least one member prematurely affected with IHD were genotyped. Using the combined transmission disequilibrium test (TDT)/sib-TDT and the pedigree disequilibrium test (PDT), no association between the MMP-3 -1612 5A/6A polymorphism and IHD was found. Our data demonstrate that, in an Irish population, the MMP-3 -1612 5A/6A polymorphism is not associated with IHD.
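The basic TDT statistic underlying the family-based tests above can be sketched as follows. The McNemar-style formula is the standard biallelic TDT; the transmission counts used in the example are hypothetical, not data from the study.

```python
# Minimal sketch of the transmission disequilibrium test (TDT) statistic for a
# biallelic marker such as 5A/6A. Heterozygous parents are classified by which
# allele they transmitted to the affected offspring (count b for one allele,
# count c for the other); under the null of no linkage/association,
# (b - c)^2 / (b + c) is approximately chi-square with 1 degree of freedom.
# The counts below are hypothetical.

def tdt_statistic(b, c):
    """McNemar-style TDT statistic from transmission counts."""
    if b + c == 0:
        return 0.0
    return (b - c) ** 2 / (b + c)

stat = tdt_statistic(b=110, c=98)  # hypothetical transmission counts
print(round(stat, 3))  # 0.692, well below the 3.84 critical value at alpha = 0.05
```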
Abstract:
A Web-service based approach is presented which enables geographically dispersed users to share software resources over the Internet. A service-oriented software sharing system has been developed, which consists of shared applications, client applications and three types of services: application proxy service, proxy implementation service and application manager service. With the aid of these services, the client applications interact with the shared applications to carry out a software sharing task. The approach satisfies the requirements of copyright protection and reuse of legacy code. In this paper, the role of Web services and the architecture of the system are presented first, followed by a case study illustrating the approach developed.
Abstract:
This article provides a rationale for and insight into an explicit children's rights-based approach to the identification of outcomes for proposed educational interventions. It presents a critical reflection on a research project which sought to integrate international children's rights standards into the design of services through a children's rights audit of potential outcomes and the meaningful engagement of children in the research and service design processes. While children are increasingly involved as co-researchers in qualitative studies, it is less common for this to occur in quantitative studies. This article offers some additional insight into children's participation in the interpretation of data from a large-scale baseline survey. The article concludes with an argument that international children's rights law provides not just a legal imperative but also a comprehensive framework with which to assert the case for increased recognition of children as salient stakeholders in all aspects of service design.
Abstract:
Measuring the degree of inconsistency of a belief base is an important issue in many real-world applications. It has been increasingly recognized that deriving syntax-sensitive inconsistency measures for a belief base from its minimal inconsistent subsets is a natural way forward. Most of the current proposals along this line do not take the size of each minimal inconsistent subset into account. However, as illustrated by the well-known Lottery Paradox, as the size of a minimal inconsistent subset increases, the degree of its inconsistency decreases. Another gap in current studies in this area concerns the role of free formulas of a belief base in measuring the degree of inconsistency, which has not yet been well characterized. Adding free formulas to a belief base enlarges the set of consistent subsets of that base. Consistent subsets of a belief base also have an impact on syntax-sensitive normalized measures of the degree of inconsistency, because each consistent subset can be considered a distinctive plausible perspective reflected by that belief base, whilst each minimal inconsistent subset projects a distinctive view of the inconsistency. To address these two issues, we propose a normalized framework for measuring the degree of inconsistency of a belief base which unifies the impact of both consistent subsets and minimal inconsistent subsets. We also show that this normalized framework satisfies all the properties deemed necessary by common consent to characterize an intuitively satisfactory measure of the degree of inconsistency for belief bases. Finally, we use a simple but explanatory example in requirements engineering to illustrate the application of the normalized framework.
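The size-sensitivity intuition described above can be sketched in a few lines. This is an illustrative toy measure, not the authors' framework: formulas are abstracted as labels, the minimal inconsistent subsets are assumed precomputed (finding them in general requires a consistency oracle), and the normalization is a deliberately crude division by base size.

```python
# Illustrative sketch of a size-sensitive, syntax-based inconsistency measure.
# Each minimal inconsistent subset (MIS) of size n contributes 1/n, so larger
# minimal inconsistent subsets count for less (the Lottery Paradox intuition);
# the total is then normalized by the size of the belief base, so free formulas
# (formulas in no MIS) dilute the measure. Hypothetical example, not the
# normalized framework proposed in the paper.

def inconsistency_degree(base, min_inconsistent_subsets):
    if not min_inconsistent_subsets:
        return 0.0  # a consistent base has degree 0
    raw = sum(1.0 / len(m) for m in min_inconsistent_subsets)
    return raw / len(base)  # crude normalization by base size

base = {"a", "not_a", "b", "not_b_or_c", "d"}  # "d" is a free formula
mises = [{"a", "not_a"}]                       # one MIS of size 2
print(inconsistency_degree(base, mises))       # 0.1
```

Note how adding another free formula to `base` would lower the value, which is exactly the interaction between free formulas and normalization that the abstract discusses.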
Abstract:
Different economic valuation methodologies can be used to value the non-market benefits of an agri-environmental scheme. In particular, the non-market value can be examined by assessing the public's willingness to pay for the policy outputs as a whole or by modelling the preferences of society for the component attributes of the rural landscape that result from the implementation of the policy. In this article we examine whether the welfare values estimated for an agri-environmental policy differ significantly between a holistic valuation methodology (contingent valuation) and an attribute-based valuation methodology (a choice experiment). It is argued that the valuation methodology chosen should depend on whether the overall objective is the valuation of the agri-environment policy package in its entirety or the valuation of each of the policy's distinct environmental outputs.
Abstract:
In polymer extrusion, delivery of a melt which is homogeneous in composition and temperature is important for good product quality. However, the process is inherently prone to temperature fluctuations which are difficult to monitor and control via conventional single-point thermocouples. In this work, the die melt temperature profile was monitored by a thermocouple mesh and the data obtained were used to generate a model to predict the die melt temperature profile. A novel nonlinear model was then proposed which was demonstrated to be in good agreement with both training and unseen data. Furthermore, the proposed model was used to select optimum process settings to achieve the desired average melt temperature across the die while improving temperature homogeneity. The simulation results indicate a reduction in melt temperature variations of up to 60%.
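The settings-selection step described above can be sketched as a small optimization over a profile model. Everything here is hypothetical: the quadratic profile model, the target temperature and the screw-speed range are stand-ins for the nonlinear model identified from thermocouple-mesh data in the actual study.

```python
# Illustrative sketch of selecting a process setting from a die melt
# temperature profile model. The model below is a hypothetical quadratic in
# screw speed and normalized die position, not the paper's nonlinear model.

def melt_temp(screw_rpm, pos):
    """Hypothetical predicted melt temperature (deg C) at normalized die
    position pos in [-1, 1] for a given screw speed (rpm)."""
    return 200.0 + 0.15 * screw_rpm + (0.02 * screw_rpm - 1.0) * pos ** 2

def profile(screw_rpm, n=21):
    """Predicted temperatures at n evenly spaced positions across the die."""
    return [melt_temp(screw_rpm, -1 + 2 * i / (n - 1)) for i in range(n)]

def objective(screw_rpm, target=210.0):
    """Trade off hitting the target average against profile flatness."""
    temps = profile(screw_rpm)
    mean = sum(temps) / len(temps)
    spread = max(temps) - min(temps)    # temperature homogeneity term
    return abs(mean - target) + spread  # smaller is better

# Grid search over candidate screw speeds.
best_rpm = min(range(10, 101), key=objective)
print(best_rpm)
```

A real implementation would search over all relevant settings (screw speed and barrel/die zone temperatures) and could weight the mean-error and homogeneity terms differently, but the structure of the trade-off is the same.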