885 results for systematic product design
Abstract:
This Master's thesis develops a framework for the preliminary design of a product data management (PDM) system. The framework has three dimensions: value creation, functionality, and software. It helps identify the value-creation components that can be influenced by the product data management functionalities offered by particular software categories. The framework's system-design perspective is applied in the company cases under study, based on the relationships between the dimensions, which are modelled in the form of a calculation matrix. The matrix takes as input the importance ratings that the value-creation and functionality components received in an interview study conducted at the target company. The output of the matrix is the suitability of a given software product for that company's case. Suitability is a set of indicators that are analysed in the results-processing phase. The suitability results assist the target company in choosing its approach to product data management and describe the preliminarily designed PDM system. Building the framework requires a thorough approach to defining the relevant value-creation and functionality components and the software categories. This definition work is based on methods and component groupings drawn up in detail in the thesis. Analysing each area makes it possible to construct the framework and the calculation matrix on the basis of consistent definitions. A characteristic of the framework is its adaptability. In its current form it is suited to electronics and high-tech companies, but it can also be applied in other industries by modifying the value-creation components according to the interests of each industry. Likewise, the software to be analysed can be selected case by case. The calculation matrix must first be updated with the capabilities of the selected software, after which the framework can produce suitability results for the company case in question.
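The calculation-matrix idea can be sketched as follows. All component names, importance ratings, support weights and capabilities below are illustrative placeholders, not the thesis's actual data, and the set of suitability indicators is collapsed here into a single weighted score for brevity.

```python
import numpy as np

# Importance ratings from the (hypothetical) interview study, scale 1-5:
value_importance = np.array([5.0, 3.0, 4.0])     # value-creation components
func_importance = np.array([4.0, 2.0, 5.0, 3.0])  # functionality components

# How strongly each functionality supports each value-creation component
# (rows: value-creation components, columns: functionalities), 0..1:
support = np.array([
    [1.0, 0.2, 0.8, 0.0],
    [0.3, 1.0, 0.0, 0.5],
    [0.0, 0.4, 1.0, 1.0],
])

# Capability of one software category per functionality, 0..1:
capability = np.array([0.9, 0.5, 0.7, 0.2])

# Each functionality contributes its capability times its importance,
# propagated to the value-creation components it supports and weighted
# by those components' importance ratings.
contrib = support * (func_importance * capability)
suitability = value_importance @ contrib.sum(axis=1)
print(round(float(suitability), 2))
```

Updating the matrix for a different software product then amounts to swapping in a new `capability` vector, which mirrors how the framework is re-targeted case by case.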
Abstract:
AIMS: Published incidences of acute mountain sickness (AMS) vary widely. Reasons for this variation, and predictive factors of AMS, are not well understood. We aimed to identify predictive factors that are associated with the occurrence of AMS, and to test the hypothesis that study design is an independent predictive factor of AMS incidence. We did a systematic search (Medline, bibliographies) for relevant articles in English or French, up to April 28, 2013. Studies of any design reporting on AMS incidence in humans without prophylaxis were selected. Data on incidence and potential predictive factors were extracted by two reviewers and crosschecked by four reviewers. Associations between predictive factors and AMS incidence were sought through bivariate and multivariate analyses for different study designs separately. Association between AMS incidence and study design was assessed using multiple linear regression. RESULTS: We extracted data from 53,603 subjects from 34 randomized controlled trials, 44 cohort studies, and 33 cross-sectional studies. In randomized trials, the median of AMS incidences without prophylaxis was 60% (range, 16%-100%); mode of ascent and population were significantly associated with AMS incidence. In cohort studies, the median of AMS incidences was 51% (0%-100%); geographical location was significantly associated with AMS incidence. In cross-sectional studies, the median of AMS incidences was 32% (0%-68%); mode of ascent and maximum altitude were significantly associated with AMS incidence. In a multivariate analysis, study design (p=0.012), mode of ascent (p=0.003), maximum altitude (p<0.001), population (p=0.002), and geographical location (p<0.001) were significantly associated with AMS incidence. Age, sex, speed of ascent, duration of exposure, or history of AMS were inconsistently reported and therefore not further analyzed. CONCLUSIONS: Reported incidences and identifiable predictive factors of AMS depend on study design.
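The multivariate step can be illustrated with a minimal sketch: a multiple linear regression of AMS incidence on dummy-coded study design, with randomized trials as the reference category. The incidence values below are fabricated for illustration, not the review's data.

```python
import numpy as np

# Fabricated per-study AMS incidences (proportions), three per design:
incidence = np.array([0.60, 0.72, 0.55, 0.51, 0.48, 0.40, 0.32, 0.25, 0.35])

# Design matrix columns: intercept, cohort dummy, cross-sectional dummy
# (randomized controlled trials are the reference level).
X = np.array([
    [1, 0, 0], [1, 0, 0], [1, 0, 0],  # randomized trials
    [1, 1, 0], [1, 1, 0], [1, 1, 0],  # cohort studies
    [1, 0, 1], [1, 0, 1], [1, 0, 1],  # cross-sectional studies
], dtype=float)

beta, *_ = np.linalg.lstsq(X, incidence, rcond=None)
intercept, b_cohort, b_cross = beta
# With balanced groups the coefficients are group-mean differences:
print(round(intercept, 3), round(b_cohort, 3), round(b_cross, 3))
```

A significant dummy coefficient is what "study design was associated with AMS incidence" means operationally: after adjustment, the expected incidence shifts with the design type.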
Abstract:
In this paper, a manufacturability analysis of a microwave test-fixture is performed and the relevant design aspects are collected. Aspects of applying systematic design to the design and manufacture of a microwave test-fixture are also analysed. Special questionnaires on the component and its machining are drawn up to elicit the information needed to ensure the DFM(A) aspects of the component. The aspects that make machining of the microwave test-fixture easy are collected. Material selection is discussed, and the stages of prototype manufacturing are presented.
Abstract:
The layout design process of the packaging laboratory at Lappeenranta University of Technology is documented in this thesis. Layout planning methods are discussed in general. The systematic layout planning procedure is presented in more detail, as it is utilised in the layout planning of the packaging laboratory. General demands for a research laboratory are discussed from both the machine and product perspectives. The possibilities for commercial food processing in the laboratory are discussed from the point of view of foodstuff processing regulations and hygiene demands. The layout planning process is documented and different layout possibilities are presented. The layout drafts are evaluated and one is developed into the final layout of the packaging laboratory. A guideline for technical planning and implementation based on the final layout is given.
Abstract:
The thesis is dedicated to the enhancement and development of a Mechanism at Company X in order to improve its key parameters and prove its workability. The current Mechanism model is described in detail. The thesis includes the various analyses, models and theories that reflect the working process of the Mechanism. On this basis, three directions of enhancement are chosen: mechanical, tribological and conceptual. As a result, a list of improvements is presented and new models of the Mechanism are built. Efficiency and lifetime values are obtained from the corresponding estimations. A comparative analysis confirms the necessity of the changes made. Recommendations for Company X specialists are given in the thesis, and proposals for deeper research are also suggested.
Abstract:
The purpose of this thesis is to gather information about additive manufacturing and to design a product to be additively manufactured. The specific manufacturing method dealt with in this thesis is powder bed fusion of metals; therefore, whenever additive manufacturing is mentioned in this thesis, it refers to powder bed fusion of metals. The literature review focuses on the principle of powder bed fusion, the general process chain in additive manufacturing, and design rules for additive manufacturing. Examples of success stories in additive manufacturing and reasons for selecting parts to be manufactured additively are also explained in the literature review. This knowledge is needed to understand the experimental part of the thesis. The experimental part is divided into two parts. Part A concentrates on finding the proper geometry for building self-supporting pipes and the proper parameters for their support structures. Part B concentrates on a case study of designing a product for additive manufacturing. As a result of experimental part A, the design process of self-supporting pipes, the results of visual analysis, and the results of 3D scanning are presented. As a result of experimental part B, the design process of the product is presented and compared to the original model.
Abstract:
In areas such as drug development, clinical diagnosis and biotechnology research, acquiring details about the kinetic parameters of enzymes is crucial. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted at the more complex kinetics now frequently studied, attention is needed to estimate the parameters of such models with low variance. We demonstrate that a Bayesian approach (the use of prior knowledge) can produce major gains quantifiable in terms of the information, productivity and accuracy of each experiment. Developing the use of Bayesian utility functions, we have used a systematic method to identify the optimum experimental designs for a number of kinetic model data sets. This has enabled the identification of trends between kinetic model types and sets of design rules, and the key conclusion that such designs should be based on some prior knowledge of K_M and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design such as the substrate range, the number of measurements and the choice of intermediate points. The final design collects data suitable for accurate modelling and analysis and minimises the error in the estimated parameters.
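A minimal sketch of the Bayesian design idea, assuming a Michaelis-Menten model v = Vmax·S/(K_M + S): candidate substrate designs are scored by the expected log-determinant of the Fisher information (a D-optimality utility), averaged over a prior on K_M. The prior, the candidate designs and the sample sizes below are illustrative, not the paper's actual utility functions or data sets.

```python
import numpy as np

rng = np.random.default_rng(0)
Vmax = 1.0
# Prior belief about K_M: lognormal around 2.0 (illustrative):
Km_prior = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=200)

def log_det_fim(S, Km):
    # Jacobian of v = Vmax*S/(Km+S) w.r.t. (Vmax, Km) at each substrate level
    dVmax = S / (Km + S)
    dKm = -Vmax * S / (Km + S) ** 2
    J = np.column_stack([dVmax, dKm])
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return logdet if sign > 0 else -np.inf

def utility(S):
    # Bayesian utility: average information over prior draws of K_M
    return np.mean([log_det_fim(S, Km) for Km in Km_prior])

designs = {
    "low only": np.array([0.1, 0.2, 0.4, 0.8]),   # all points below K_M
    "spread":   np.array([0.5, 2.0, 8.0, 32.0]),  # spans K_M to saturation
}
best = max(designs, key=lambda k: utility(designs[k]))
print(best)
```

The "low only" design scores poorly because below K_M the two sensitivities are nearly proportional, so Vmax and K_M cannot be separated; spanning the range up to saturation disentangles them, which matches the paper's point that the substrate range should reflect prior knowledge of K_M.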
Abstract:
Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to resolve problems in the context of architectural design practice, e.g., to relate design by nature to design by human. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that make it possible to transform almost instantaneously any 2D design representation into a physical three-dimensional model through a rapid prototyping machine. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be established with design by nature, where the natural laying down of earth layers shapes the earth's surface, a natural process occurring repeatedly over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hardcopy and use it for the validation of their design ideas. Concurrent design is a systematic approach aiming to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs at the design phase and a quality improvement of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers to fully define the shape of their design ideas and the extent of the application of rapid prototyping technologies coupled with Internet facilities in design practice.
The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, though acknowledging a lack of knowledge in relation to the issue of rapid prototyping.
Abstract:
Purpose – This paper seeks to examine the nature of "service innovation" in the facilities management (FM) context. It reviews recent thinking on "service innovation" as distinct from "product innovation". Applying these contemporary perspectives, it describes UK case studies of 11 innovations in different FM organisations, including both in-house client-based innovations and third-party innovations. Design/methodology/approach – The study described in the paper encompasses 11 different innovations that constitute a mix of process, product and practice innovations. All of the innovations stem from UK-based organisations that were subject to in-depth interviews regarding the identification, screening, commitment of resources and implementation of the selected innovations. Findings – The research suggested that service innovation is highly active in the UK FM sector. However, the process of innovation rarely followed a common formalised path. Generally, the innovations were one-shot commitments from the early stage. None of the innovations studied failed to proceed to the full adoption stage. This was due either to the reluctance of participating organisations to volunteer "tested but unsuccessful" innovations or to the absence of any trial methods that might have exposed an innovation's shortcomings. Research limitations/implications – The selection of innovations was restricted to the UK context. Moreover, the choice of innovations was partly determined by the innovating organisation. This selection process appeared to emphasise "one-shot", high-profile technological innovations, typically associated with software. This may have been at the expense of less resource-intensive, bottom-up innovations. Practical implications – This paper suggests that there is a role for "research and innovation" teams within larger FM organisations, whether client-based or third-party. Central to this philosophy is an approach that is open to the possibility of failure. The innovations studied were risk-averse, with a firm commitment to proceed made at the early stage. Originality/value – This paper introduces new thinking on the subject of "service innovation" to the context of FM. It presents research and development as a planned solution to innovation. This approach will enable service organisations to fully test and exploit service innovations.
Abstract:
In addition to technical quality, increasing emphasis is being placed on the importance of elements such as the appearance and meaning of products. To be successful, therefore, attention must be paid to the aesthetic and symbolic functions of objects as well as to reliability and physical quality. Study of the interfaces of these functions may provide a theoretical basis for the ergonomic design of products. The objective of this review is to attempt to establish the nature of these interfaces.
Abstract:
Little is known about the situational contexts in which individuals consume processed sources of dietary sugars. This study aimed to describe the situational contexts associated with the consumption of sweetened food and drink products in a Catholic Middle Eastern Canadian community. A two-stage exploratory sequential mixed-method design was employed with a rationale of triangulation. In stage 1 (n = 62), items and themes describing the situational contexts of sweetened food and drink product consumption were identified from semi-structured interviews and were used to develop the content of the Situational Context Instrument for Sweetened Product Consumption (SCISPC). The face validity, readability and cultural relevance of the instrument were assessed. In stage 2 (n = 192), a cross-sectional study was conducted and exploratory factor analysis was used to examine the structure of themes that emerged from the qualitative analysis as a means of furthering construct validation. The reliability of the SCISPC and its predictive validity for the daily consumption of sweetened products were also assessed. In stage 1, six themes and 40 items describing the situational contexts of sweetened product consumption emerged from the qualitative analysis and were used to construct the first draft of the SCISPC. In stage 2, factor analysis enabled the clarification and expansion of the instrument's initial thematic structure. The revised SCISPC has seven factors and 31 items describing the situational contexts of sweetened product consumption. Initial validation indicated that the instrument has excellent internal consistency and adequate test-retest reliability. Two factors of the SCISPC (Snacking and Energy demands) had predictive validity for the daily consumption of total sugar from sweetened products, while the other factors (Socialization, Indulgence, Constraints, Visual Stimuli and Emotional needs) were instead associated with occasional consumption of these products.
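Internal consistency of an instrument like the SCISPC is typically quantified with Cronbach's alpha. A minimal sketch on fabricated item scores (not the study's data; the 6-item, 30-respondent matrix is simulated so that the items share a common latent factor):

```python
import numpy as np

def cronbach_alpha(items):
    # items: respondents x items matrix of scores
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(30, 1))                  # one shared factor
scores = latent + 0.5 * rng.normal(size=(30, 6))   # 6 correlated items
alpha = cronbach_alpha(scores)
print(round(float(alpha), 2))
```

Because the simulated items are strongly correlated, alpha comes out high; uncorrelated items would drive it toward zero, which is why alpha is read as an internal-consistency index.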
Abstract:
Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs of design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy products packed in boxes, such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in defining the mechanical structure and the electrical and electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activity inherent in automating the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; directing the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing diagnostic information in real time, in support of machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed empirically, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured" way. No clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control, discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), again leading to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been adopting this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been taken up in commercial products only recently. On the other hand, in the scientific and technical literature, many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to observe considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should deal not only with the nominal behaviour but also with other important duties, such as diagnosis and fault isolation, recovery, and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, complex systems such as AMS contain, alongside reliable mechanical elements, an increasing number of electronic devices, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in designing an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, an important improvement to formal verification of logic control, fault diagnosis and fault-tolerant control derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2, a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5, a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
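The Discrete Event Systems view used for verification can be illustrated with a toy example (an illustrative model, not one of the thesis's components): a plant modelled as a finite automaton, a supervisor that may disable only controllable events, and a reachability check for a forbidden state.

```python
# Plant as a finite automaton: state -> {event: next_state}
PLANT = {
    "idle":    {"start": "working"},
    "working": {"done": "idle", "fault": "failed", "overload": "failed"},
    "failed":  {"reset": "idle"},
}
CONTROLLABLE = {"start", "overload"}  # events a supervisor may disable

def reachable(plant, init, disabled=frozenset()):
    """States reachable under a supervisor that disables the given
    controllable events (uncontrollable events can never be disabled)."""
    seen, stack = {init}, [init]
    while stack:
        state = stack.pop()
        for event, nxt in plant[state].items():
            if event in disabled and event in CONTROLLABLE:
                continue
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print("failed" in reachable(PLANT, "idle"))                         # True
print("failed" in reachable(PLANT, "idle", disabled={"overload"}))  # True: "fault" is uncontrollable
print("failed" in reachable(PLANT, "idle", disabled={"start"}))     # False: safe, but production stops
```

The second check is the kind of fact a formal analysis exposes: disabling the controllable "overload" event does not make the forbidden state unreachable, because the uncontrollable "fault" event still leads there; diagnoser and supervisor synthesis in DES theory are built on exactly this reachability reasoning over larger models.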
Abstract:
OBJECTIVE: To review the accuracy of electrocardiography in screening for left ventricular hypertrophy in patients with hypertension. DESIGN: Systematic review of studies of the test accuracy of six electrocardiographic indexes: the Sokolow-Lyon index, Cornell voltage index, Cornell product index, Gubner index, and Romhilt-Estes scores with thresholds for a positive test of ≥4 points or ≥5 points. DATA SOURCES: Electronic databases ((Pre-)Medline, Embase), reference lists of relevant studies and previous reviews, and experts. STUDY SELECTION: Two reviewers scrutinised abstracts and examined potentially eligible studies. Studies comparing an electrocardiographic index with echocardiography in hypertensive patients and reporting sufficient data were included. DATA EXTRACTION: Data on study populations, echocardiographic criteria, and the methodological quality of studies were extracted. DATA SYNTHESIS: Negative likelihood ratios, which indicate to what extent the posterior odds of left ventricular hypertrophy are reduced by a negative test, were calculated. RESULTS: 21 studies and data on 5608 patients were analysed. The median prevalence of left ventricular hypertrophy was 33% (interquartile range 23-41%) in primary care settings (10 studies) and 65% (37-81%) in secondary care settings (11 studies). The median negative likelihood ratio was similar across electrocardiographic indexes, ranging from 0.85 (range 0.34-1.03) for the Romhilt-Estes score (threshold ≥4 points) to 0.91 (0.70-1.01) for the Gubner index. Using the Romhilt-Estes score in primary care, a negative electrocardiogram result would reduce the typical pre-test probability from 33% to 31%. In secondary care, the typical pre-test probability of 65% would be reduced to 63%. CONCLUSION: Electrocardiographic criteria should not be used to rule out left ventricular hypertrophy in patients with hypertension.
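The pre-test to post-test step in the results can be made explicit: a negative likelihood ratio multiplies the pre-test odds, and the post-test odds are converted back to a probability.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    # post-test odds = pre-test odds * likelihood ratio
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Median LR- of 0.85 (Romhilt-Estes, threshold >= 4 points) applied to the
# typical primary-care pre-test probability of 33%:
p = post_test_probability(0.33, 0.85)
print(round(p * 100, 1))  # prints 29.5
```

Applying the pooled median LR directly gives about 30%, slightly below the 31% quoted in the results, presumably because the review's figure was derived study by study rather than from the pooled medians; either way, the reduction is marginal, which is the basis for the conclusion that a negative ECG cannot rule out left ventricular hypertrophy.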