915 results for Engineering design--Data processing
Abstract:
Background and purpose Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions Detailed documentation of all survey procedures, standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
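The sensitivity-testing idea is simple enough to sketch. The snippet below illustrates, with entirely hypothetical population names, quality scores, and prevalence values, how an analysis could be repeated after excluding low-scoring populations and after weighting populations by their quality scores; it illustrates the approach described above, not the MONICA Project's actual procedure.

```python
# Sensitivity testing with quality scores: (a) exclude populations with low
# scores, (b) weight each population's estimate by its score.
# All names, scores, and prevalence values are hypothetical illustrations.

populations = [
    # (population, quality score on a 0-1 scale, observed risk-factor prevalence)
    ("pop_A", 0.95, 0.31),
    ("pop_B", 0.60, 0.42),
    ("pop_C", 0.85, 0.28),
]

QUALITY_CUTOFF = 0.70  # assumed threshold for the exclusion analysis

# (a) exclusion: drop populations whose quality score falls below the cutoff
included = [p for p in populations if p[1] >= QUALITY_CUTOFF]
mean_excluding_low = sum(p[2] for p in included) / len(included)

# (b) weighting: weight each population's estimate by its quality score
total_weight = sum(p[1] for p in populations)
weighted_mean = sum(p[1] * p[2] for p in populations) / total_weight

unweighted_mean = sum(p[2] for p in populations) / len(populations)
print(f"unweighted mean prevalence : {unweighted_mean:.3f}")
print(f"mean, low quality excluded : {mean_excluding_low:.3f}")
print(f"quality-weighted mean      : {weighted_mean:.3f}")
```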
Abstract:
Performance prediction models for partial-face mechanical excavators, when developed under laboratory conditions, depend on relating the results of a set of rock property tests and indices to the specific cutting energy (SE) for various rock types. Some studies in the literature aim to correlate the geotechnical properties of intact rocks with the SE, especially for massive and widely jointed rock environments. However, studies that include direct and/or indirect measures of rock fracture parameters such as rock brittleness and fracture toughness, along with other rock parameters expressing different aspects of rock behaviour under drag tools (picks), are rather limited. This study aimed to investigate the relationships between indirect measures of rock brittleness and fracture toughness and the SE, based on the results of one new and two previous linear rock cutting programmes. Relationships between the SE, rock strength parameters, and rock index tests were also investigated. Sandstone samples taken from different fields around Ankara, Turkey were used in the new testing programme. Detailed mineralogical analyses, petrographic studies, and rock mechanics and rock cutting tests were performed on the selected sandstone specimens. The assessment of rock cuttability was based on the SE. Three brittleness indices (B1, B2, and B4) were calculated for the sandstone samples, whereas a toughness index (T-i), developed by Atkinson et al. (1), was employed to represent indirect rock fracture toughness. The relationships between the SE and the large amount of new data obtained from the mineralogical analyses, petrographic studies, rock mechanics, and linear rock cutting tests were evaluated using bivariate correlation and curve fitting techniques, variance analysis, and Student's t-test. Rock cutting and rock property testing data from the well-known studies of McFeat-Smith and Fowell (2) and Roxborough and Philips (3) were also employed in the statistical analyses together with the new data. The laboratory tests and subsequent analyses revealed close correlations between the SE and B4, whereas no statistically significant correlation was found between the SE and T-i. Uniaxial compressive and Brazilian tensile strengths and Shore scleroscope hardness of the sandstones also exhibited strong relationships with the SE. The NCB cone indenter test had the greatest influence on the SE among the engineering properties of rocks examined, confirming previous studies in rock cutting and mechanical excavation. It is therefore recommended that the easy-to-use NCB cone indenter and Shore scleroscope index tests be employed to estimate the laboratory SE of sandstones ranging from very low to high strength when a rock cutting rig is not available, until easy-to-use universal measures of rock brittleness and, especially, rock fracture toughness (an intrinsic rock property) are developed.
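As a rough illustration of the statistical treatment described (bivariate correlation, curve fitting, and a significance test on the correlation), the sketch below uses invented SE and B4 values; it is not the paper's sandstone data, and SciPy's pearsonr and NumPy's polyfit stand in for whatever software the authors used.

```python
# Bivariate correlation and a simple least-squares fit between a brittleness
# index (B4) and specific cutting energy (SE). Values are invented.
import numpy as np
from scipy import stats

b4 = np.array([4.1, 5.3, 6.0, 7.2, 8.5, 9.1, 10.4])      # hypothetical B4 values
se = np.array([8.2, 9.7, 11.0, 13.1, 15.6, 16.2, 18.9])  # hypothetical SE values (MJ/m^3)

# Pearson correlation; the p-value corresponds to the t-test on the coefficient.
r, p_value = stats.pearsonr(b4, se)
print(f"r = {r:.3f}, p = {p_value:.4f}")

# Curve fitting: a linear model SE = a*B4 + b via least squares.
a, b = np.polyfit(b4, se, deg=1)
print(f"SE = {a:.2f}*B4 + {b:.2f} (least-squares fit)")
```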
Abstract:
Although managers consider accurate, timely, and relevant information critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported here investigated and tracked data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research identified non-trivial errors, the participating organisation introduced actions to correct them and to prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the project. The results indicated that, for a mission-critical database, ensuring and maintaining data quality requires a commitment to continuous data quality improvement. Communication among all stakeholders is also required to ensure a common understanding of data quality improvement goals. The project further found that substantial additional improvements in data quality sometimes require structural changes within the organisation and to its information systems. The major goal of the study is to increase the level of data quality awareness within organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.
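A minimal sketch of what quarterly data-quality metrics of this kind might look like follows, using hypothetical audit counts for transaction input errors and processing errors; the class name and figures are illustrative, not the participating organisation's data.

```python
# Quarterly error-rate metrics for two error types, tracked over one year
# so that improvement can be compared quarter by quarter. Figures are invented.
from dataclasses import dataclass

@dataclass
class QuarterAudit:
    quarter: str
    records_checked: int
    input_errors: int        # transaction input errors found
    processing_errors: int   # processing errors found

    @property
    def input_error_rate(self) -> float:
        return self.input_errors / self.records_checked

    @property
    def processing_error_rate(self) -> float:
        return self.processing_errors / self.records_checked

audits = [
    QuarterAudit("Q1", 10_000, 420, 130),
    QuarterAudit("Q2", 10_000, 310, 95),
    QuarterAudit("Q3", 10_000, 220, 60),
    QuarterAudit("Q4", 10_000, 150, 40),
]

for a in audits:
    print(f"{a.quarter}: input {a.input_error_rate:.2%}, processing {a.processing_error_rate:.2%}")
```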
Abstract:
Even when data repositories exhibit near-perfect data quality, users may formulate queries that do not correspond to the information requested. Users' poor information retrieval performance may arise either from problems in understanding the data models that represent the real-world systems or from their query skills. This research focuses on users' understanding of the data structures, i.e., their ability to map the information request onto the data model. The Bunge-Wand-Weber ontology was used to formulate three sets of hypotheses. Two laboratory experiments (one using a small data model and one using a larger data model) tested the effect of ontological clarity on users' performance when undertaking component-, record-, and aggregate-level tasks. For the hypotheses concerning representations that differ in structure but are semantically equivalent, the results indicate that participants using the parsimonious data model performed better on component-level tasks, whereas participants using the ontologically clearer data model performed better on record- and aggregate-level tasks.
Abstract:
This paper reviews some basic issues and methods involved in using neural networks to respond in a desired fashion to a temporally varying environment. Some popular network models and training methods are introduced. A speech recognition example is then used to illustrate the central difficulty of temporal data processing: learning to notice and remember relevant contextual information. Feedforward network methods are applicable to cases where this problem is not severe. The application of these methods is explained, and applications are discussed in the areas of pure mathematics, chemical and physical systems, and economic systems. A more powerful but less practical algorithm for temporal problems, the moving targets algorithm, is sketched and discussed. For completeness, a few remarks are made on reinforcement learning.
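The standard feedforward approach to temporal data referred to above presents a fixed window of recent samples as the network input (a tapped delay line), so recent context is available without recurrent connections. The sketch below builds such windows for a toy signal; the window length and signal are arbitrary illustrations, and the network itself is omitted.

```python
# Sliding-window (tapped delay line) inputs for a feedforward network applied
# to a temporally varying signal. The signal and window length are illustrative.
import numpy as np

def sliding_windows(signal: np.ndarray, window: int) -> np.ndarray:
    """Stack overlapping windows of `window` consecutive samples as rows."""
    return np.stack([signal[i:i + window] for i in range(len(signal) - window + 1)])

t = np.linspace(0, 4 * np.pi, 200)
signal = np.sin(t)                       # toy temporally varying input
X = sliding_windows(signal[:-1], 10)     # network inputs: 10-sample context windows
y = signal[10:]                          # targets: the next sample after each window

print(X.shape, y.shape)                  # (190, 10) (190,)
```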
Abstract:
This paper disputes the claim that product design determines 70% of costs, and the implications that follow for design evaluation tools. Using the idea of decision chains, it is argued that such tools need to consider more of the downstream business activities and should take into account the current and future state of the business rather than some idealized view of it. To illustrate the argument, a series of experiments using an enterprise simulator is described, showing the benefit of applying a more holistic 'design for' technique: Design For the Existing Environment.
Abstract:
Purpose – The paper assesses the extent to which China’s comparative advantage in manufacturing shifted towards higher-tech sectors between 1987 and 2005 and proposes possible explanations for the shift. Design/methodology/approach – Revealed comparative advantage (RCA) indices for 27 product groups, representing high-, medium-, and low-tech sectors, have been calculated. Examination of international market attractiveness complements the RCA analysis. Findings for selected sectors are evaluated in the context of other evidence. Findings – While China maintains its competitiveness in low-tech labour-intensive products, it has gained RCA in selected medium-tech sectors (e.g. office machines and electric machinery) and in the high-tech telecommunications and automatic data processing equipment sectors. Evidence from firm- and sector-specific studies suggests that improved comparative advantage in medium- and high-tech sectors is based on capabilities developed by combining international technology transfer and learning. Research limitations/implications – The quantitative analysis does not explain the shifts in comparative advantage, though the paper suggests possible explanations. Further research at firm and sector levels is required to understand the underlying capability development of Chinese enterprises and the relative competitiveness of Chinese and foreign-invested enterprises. Practical implications – Western companies should take account of capability development in China when forming their international manufacturing strategies. The rapid shifts in China’s comparative advantage have lessons for other industrialising countries. Originality/value – While RCA is a well-known methodology, its application at the disaggregated product group level, combined with market attractiveness assessment, is distinctive. The paper provides a broad assessment of changes in Chinese manufacturing as a basis for further research on capability development at firm and sector levels.
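For reference, the revealed comparative advantage index used in this kind of analysis is a country's export share in a product group divided by the corresponding world share, with values above 1 indicating revealed advantage. The sketch below computes it with invented export figures; they are not the paper's trade data.

```python
# Revealed comparative advantage (Balassa index):
# RCA = (country exports of product / country total exports)
#     / (world exports of product   / world total exports)
# All export figures below are hypothetical.

def rca(country_product_exports: float, country_total_exports: float,
        world_product_exports: float, world_total_exports: float) -> float:
    return (country_product_exports / country_total_exports) / (
        world_product_exports / world_total_exports)

# Hypothetical figures for one product group (e.g. office machines), in US$bn.
value = rca(country_product_exports=80.0, country_total_exports=760.0,
            world_product_exports=400.0, world_total_exports=12_000.0)
print(f"RCA = {value:.2f} -> {'advantage' if value > 1 else 'no advantage'}")
```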
Abstract:
This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated, and a powerful visualisation system using it was implemented. Although it allowed databases to be explored and conclusions to be drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, firstly data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries that would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. MADEN allowed both universities and disciplines to be compared graphically, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
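The core of a 'density plot' matrix can be sketched quite compactly: one 2-D histogram per pair of fields, tiled into a single window. The snippet below illustrates that idea with random data; it is not the MADEN implementation, and the field names and figure size are arbitrary.

```python
# A 2-D matrix of density plots: each panel is a 2-D histogram of one field
# pair, so an entire table can be scanned in a single window. Data are random.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.normal(size=(5_000, 4))           # hypothetical table: 5,000 records, 4 fields
fields = [f"field_{i}" for i in range(4)]

fig, axes = plt.subplots(4, 4, figsize=(8, 8))
for i in range(4):
    for j in range(4):
        ax = axes[i, j]
        ax.hist2d(data[:, j], data[:, i], bins=40)   # record density for the field pair
        ax.set_xticks([]); ax.set_yticks([])
        if i == 3:
            ax.set_xlabel(fields[j])
        if j == 0:
            ax.set_ylabel(fields[i])
plt.tight_layout()
plt.savefig("density_matrix.png")
```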
Abstract:
The analysis and prediction of the dynamic behaviour of structural components plays an important role in modern engineering design. In this work, the so-called "mixed" finite element models based on Reissner's variational principle are applied to the solution of free and forced vibration problems for beam and plate structures. The mixed beam models are obtained by using elements with various shape functions, ranging from simple linear to more complex quadratic and cubic functions. The elements were in general capable of predicting the natural frequencies and dynamic responses with good accuracy. An isoparametric quadrilateral element with 8 nodes was developed for application to thin plate problems. The element has 32 degrees of freedom (one deflection, two bending moments, and one twisting moment per node), which makes it suitable for discretization of plates with arbitrary geometry. A linear isoparametric element and two non-conforming displacement elements (4-node and 8-node quadrilateral) were extended to the solution of dynamic problems. An auto-mesh generation program was used to facilitate the preparation of the input data required by the 8-node quadrilateral elements of mixed and displacement type. Numerical examples were solved using both the mixed beam and plate elements to predict a structure's natural frequencies and dynamic response to a variety of forcing functions. The solutions were compared with the available analytical and displacement model solutions. The mixed elements developed have been found to have significant advantages over the conventional displacement elements in the solution of plate-type problems: a dramatic saving in computational time is possible without any loss in solution accuracy. With beam-type problems, there appear to be no significant advantages in using mixed models.
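Whatever the element formulation (mixed or displacement based), a free vibration analysis ultimately reduces to the generalised eigenvalue problem Kx = ω²Mx for the assembled stiffness and mass matrices. The sketch below solves a tiny, hypothetical system to show that step; it does not reproduce the thesis's mixed elements.

```python
# Free vibration as a generalised eigenvalue problem: K x = omega^2 M x.
# K and M are small, hypothetical assembled matrices for illustration only.
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 4.0, -2.0],      # hypothetical assembled stiffness matrix
              [-2.0,  3.0]])
M = np.array([[ 2.0,  0.0],      # hypothetical assembled (lumped) mass matrix
              [ 0.0,  1.0]])

eigvals, eigvecs = eigh(K, M)            # solves K x = lambda M x
natural_freqs = np.sqrt(eigvals)         # omega = sqrt(lambda), in rad/s
print("natural frequencies (rad/s):", natural_freqs)
```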
Abstract:
The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data, and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection, and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation, and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the program can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
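The screening step of a computerised materials selection system can be illustrated in a few lines: candidate records are filtered against design limits on selected properties and then ranked. The materials, property values, and limits below are illustrative only, not entries from the Aston system.

```python
# Property screening for materials selection: filter candidates against design
# limits, then rank by a cost index. All records and limits are illustrative.

materials = [
    # name, density (kg/m^3), yield strength (MPa), relative cost index
    {"name": "Al alloy 6061", "density": 2700, "yield_mpa": 275, "cost": 2.0},
    {"name": "Mild steel",    "density": 7850, "yield_mpa": 250, "cost": 1.0},
    {"name": "Ti-6Al-4V",     "density": 4430, "yield_mpa": 880, "cost": 9.0},
]

def screen(candidates, max_density, min_yield_mpa):
    """Return candidates meeting the density and strength limits, cheapest first."""
    passed = [m for m in candidates
              if m["density"] <= max_density and m["yield_mpa"] >= min_yield_mpa]
    return sorted(passed, key=lambda m: m["cost"])

for m in screen(materials, max_density=5000, min_yield_mpa=260):
    print(m["name"])
```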