975 results for product modelling
Abstract:
Asset management (AM) processes play an important role in assisting enterprises to manage their assets more efficiently. To visualise and improve AM processes, the processes need to be modelled using certain process modelling methodologies. Understanding the requirements for AM process modelling is essential for selecting or developing effective AM process modelling methodologies. However, little research has been done on analysing the requirements. This paper attempts to fill this gap by investigating the features of AM processes. It is concluded that AM process modelling requires intuitive representation of its processes, ‘fast’ implementation of the process modelling, effective evaluation of the processes and sound system integration.
Abstract:
This paper describes a generalised linear mixed model (GLMM) approach for understanding spatial patterns of participation in population health screening, in the presence of multiple screening facilities. The models presented have a dual focus, namely the prediction of expected patient flows from regions to services and relative rates of participation by region-service combination, with both outputs having meaningful implications for the monitoring of current service uptake and provision. The novelty of this paper lies with the former focus, and an approach for distributing expected participation by region based on proximity to services is proposed. The modelling of relative rates of participation is achieved through the combination of different random effects, as a means of assigning excess participation to different sources. The methodology is applied to participation data collected from a government-funded mammography program in Brisbane, Australia.
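As a rough illustration of distributing expected participation by proximity, the sketch below uses a gravity-model style rule in which each region's expected participation is shared across services in proportion to an exponential distance decay. The function names, the decay form and the toy data are assumptions for illustration, not the paper's actual allocation rule.

```python
import math

def expected_flows(region_totals, distances, decay=1.0):
    """Distribute each region's expected participation across services in
    proportion to exp(-decay * distance). This is a generic gravity-model
    style rule; the paper's allocation approach may differ in detail."""
    flows = {}
    for region, total in region_totals.items():
        weights = {s: math.exp(-decay * d) for s, d in distances[region].items()}
        norm = sum(weights.values())
        for service, w in weights.items():
            # Row-normalised shares, so flows from a region sum to its total.
            flows[(region, service)] = total * w / norm
    return flows

# Toy data: two regions, two screening services, distances in km.
totals = {"A": 100.0, "B": 60.0}
dists = {"A": {"S1": 2.0, "S2": 8.0}, "B": {"S1": 6.0, "S2": 3.0}}
flows = expected_flows(totals, dists, decay=0.5)
```

By construction, the flows from each region sum to that region's total, and the nearer service receives the larger share.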
Abstract:
My quantitative study asks how Chinese Australians’ “Chineseness” and their various resources influence their Chinese language proficiency, using an online survey and snowball sampling. ‘Operationalization’ is a challenging process which ensures that the survey design talks back to the informing theory and forwards to the analysis model. It requires attention to two core methodological concerns, namely ‘validity’ and ‘reliability’. Construction of a high-quality questionnaire is critical to achieving valid and reliable operationalization. A series of strategies was chosen to ensure the quality of the questions, and thus of the eventual data. These strategies enable the use of structural equation modelling to examine how well the data fit the theoretical framework, which was constructed in light of Bourdieu’s theory of habitus, capital and field.
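Scale reliability of the kind mentioned above is commonly assessed with Cronbach's alpha. The abstract does not say which statistic was used, so the following is only a generic sketch of that standard reliability coefficient on hypothetical Likert-style item scores.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item):
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance
    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Three Likert-style items answered by five respondents (toy data).
items = [[4, 5, 3, 4, 2], [4, 4, 3, 5, 2], [5, 5, 2, 4, 3]]
alpha = cronbach_alpha(items)  # here about 0.89, conventionally "acceptable"
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency of a scale.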
Abstract:
Recently many international tertiary educational programs have capitalised on the value design and business can have upon their intersection (Martin, 2009; Brown, 2008; Bruce and Bessant, 2002; Manzini, 2009). This paper discusses the role that two teaching units – New Product Development and Design Led Innovation – play in forming an understanding of commercialisation needed in today’s Industrial Design education. These units are taught consecutively in the later years of the Bachelor of Industrial Design program at the Queensland University of Technology, Brisbane, Australia. In this paper, each teaching unit is discussed in detail and then as a conglomerate, in order to establish the base of knowledge students need to fully capitalise on the value design has in business, and to produce a more capable Industrial Design graduate of the future.
Abstract:
This research paper explores the impact product personalisation has upon product attachment and aims to develop a deeper understanding of why, how and whether consumers choose to personalise their products. The current research in this field is mainly based on attachment theories and is predominantly product specific. This paper researches the link between product attachment and personalisation through in-depth, semi-structured interviews, where the data were thematically analysed and broken down into three themes and nine sub-themes. It was found that participants did become more attached to products once they were personalised, and the reasons why this occurred varied. The most common reasons that led to personalisation were functionality and usability, the expression of personality through a product and the complexity of personalisation. The reasons why participants felt connected to their products included strong emotions/memories, the amount of time and effort invested in the personalisation, and a sense of achievement. Reasons behind the desire for personalisation included co-designing, expression of uniqueness/individualism and having choice in personalisation. Through theme and inter-theme relationships, many correlations were formed, which created the basis for design recommendations. These recommendations demonstrate how a designer could incorporate the emotions and reasoning behind personalisation into the design process.
Abstract:
Much has been written on Michel Foucault’s reluctance to clearly delineate a research method, particularly with respect to genealogy (Harwood 2000; Meadmore, Hatcher, & McWilliam 2000; Tamboukou 1999). Foucault (1994, p. 288) himself disliked prescription, stating, “I take care not to dictate how things should be”, and wrote provocatively to disrupt equilibrium and certainty, so that “all those who speak for others or to others” no longer know what to do. It is doubtful, however, that Foucault ever intended for researchers to be stricken by that malaise to the point of being unwilling to make an intellectual commitment to methodological possibilities. Taking criticism of “Foucauldian” discourse analysis as a convenient point of departure to discuss the objectives of poststructural analyses of language, this paper develops what might be called a discursive analytic: a methodological plan to approach the analysis of discourses through the location of statements that function with constitutive effects.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research projects and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, although research teams can save time and money by recruiting general citizens to volunteer their time and skills to help with data analysis, the reliability of contributed data varies considerably. Data reliability issues are significant in the domain of Citizen Science due to the quantity and diversity of people and devices involved. Participants may submit low-quality, misleading, inaccurate, or even malicious data. Therefore, finding ways to improve data reliability has become an urgent need. This study aims to investigate techniques to enhance the reliability of data contributed by general citizens in scientific research projects, especially acoustic sensing projects. In particular, we propose to design a reputation framework to enhance data reliability, and we also investigate critical elements that should be considered when designing and developing new reputation systems.
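To make the reputation idea concrete, the toy scheme below forms a reputation-weighted consensus from each round of submissions and nudges each contributor's reputation toward their agreement with that consensus. The update rule, rates and names are illustrative assumptions, not the framework the study proposes.

```python
def update_reputations(reputations, submissions, rate=0.2):
    """One round of a simple reputation update: the consensus value is a
    reputation-weighted mean, and each contributor's reputation drifts
    toward an agreement score in (0, 1]. A toy scheme for illustration."""
    total = sum(reputations[c] for c in submissions)
    consensus = sum(reputations[c] * v for c, v in submissions.items()) / total
    for c, v in submissions.items():
        agreement = 1.0 / (1.0 + abs(v - consensus))   # 1 when spot-on
        reputations[c] = (1 - rate) * reputations[c] + rate * agreement
    return consensus

reps = {"alice": 0.5, "bob": 0.5, "mallory": 0.5}
# Alice and Bob report values near the true signal (~10); Mallory submits outliers.
for a, b, m in [(10.1, 9.9, 30.0), (10.0, 10.2, -5.0), (9.8, 10.1, 40.0)]:
    update_reputations(reps, {"alice": a, "bob": b, "mallory": m})
```

After a few rounds the outlier-submitting contributor's reputation falls below that of the consistent contributors, so their future submissions carry less weight in the consensus.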
Abstract:
In this paper we investigate the distribution of the product of Rayleigh distributed random variables. Considering the Mellin-Barnes inversion formula and using the saddle point approach, we obtain an upper bound for the product distribution. The accuracy of this tail approximation increases as the number of random variables in the product increases.
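A Monte Carlo estimate of the product's tail probability offers a simple cross-check for analytical bounds of this kind (this is only a sanity-check sketch, not the paper's saddle-point bound). It uses the fact that Rayleigh(σ) equals a Weibull distribution with shape 2 and scale σ√2.

```python
import math, random

def rayleigh(sigma, rng):
    # Rayleigh(sigma) is Weibull with shape 2 and scale sigma*sqrt(2).
    return rng.weibullvariate(sigma * math.sqrt(2.0), 2.0)

def product_tail_prob(n, x, sigma=1.0, samples=100_000, seed=1):
    """Empirical P(X1*...*Xn > x) for i.i.d. Rayleigh(sigma) factors."""
    rng = random.Random(seed)
    count = 0
    for _ in range(samples):
        prod = 1.0
        for _ in range(n):
            prod *= rayleigh(sigma, rng)
        if prod > x:
            count += 1
    return count / samples

# Sanity check against the exact single-variable tail exp(-x^2/2) ~ 0.1353.
p1 = product_tail_prob(n=1, x=2.0)
p2 = product_tail_prob(n=2, x=2.0)
```

For n = 1 the empirical tail should match exp(-x²/2) to within Monte Carlo error, giving confidence in the estimator before it is applied to larger products.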
Abstract:
Incorporating design thinking as a generic capability at a school level is needed to ensure future generations are empowered for business innovation and active citizenship. This paper describes the methodology of an investigation into modelling design led innovation approaches from the business sector to secondary education, as part of a larger study. It builds on a previously discussed research agenda by outlining the scope, significance and limitations of currently available research in this area, examining an action research methodology utilising an Australian design immersion program case study, and discussing implications and future work. It employs a triangulated approach encompassing thematic analysis of qualitative data collection from student focus groups, semi-structured convergent interviews with teachers and facilitators, and student journals. Eventual outcomes will be reviewed and analysed within the framework of a proposed innovation matrix model for educational growth, synthesising principles responding to 21st century student outcomes. It is anticipated this research will inform a successful design led secondary education innovation model, facilitating new engagement frameworks between tertiary and secondary education sectors, as well as providing new insight into the suitability of action research in prototyping social innovation in Australia.
Abstract:
A critical step in the dissemination of ovarian cancer is the formation of multicellular spheroids from cells shed from the primary tumour. The objectives of this study were to apply bioengineered three-dimensional (3D) microenvironments for culturing ovarian cancer spheroids in vitro and simultaneously to build on a mathematical model describing the growth of multicellular spheroids in these biomimetic matrices. Cancer cells derived from human epithelial ovarian carcinoma were embedded within biomimetic hydrogels of varying stiffness and grown for up to 4 weeks. Immunohistochemistry, imaging and growth analyses were used to quantify the dependence of cell proliferation and apoptosis on matrix stiffness, long-term culture and treatment with the anti-cancer drug paclitaxel. The mathematical model was formulated as a free boundary problem in which each spheroid was treated as an incompressible porous medium. The functional forms used to describe the rates of cell proliferation and apoptosis were motivated by the experimental work, and predictions of the mathematical model were compared with the experimental output. This work aimed to establish whether it is possible to simulate solid tumour growth on the basis of data on spheroid size, cell proliferation and cell death within these spheroids. The mathematical model predictions were in agreement with the experimental data set and simulated how the growth of cancer spheroids was influenced by mechanical and biochemical stimuli including matrix stiffness, culture duration and administration of a chemotherapeutic drug. Our computational model provides new perspectives on experimental results and has informed the design of new 3D studies of chemoresistance of multicellular cancer spheroids.
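The qualitative behaviour such models capture (growth that slows as the spheroid approaches a limiting size) can be sketched with a simple logistic surrogate for the spheroid radius. This stand-in is far simpler than the paper's free boundary porous-medium model; the parameter values and the logistic form itself are assumptions for illustration only.

```python
def spheroid_radius(r0, growth, r_max, days, dt=0.01):
    """Forward-Euler integration of dR/dt = growth * R * (1 - R/r_max),
    a generic logistic surrogate for net proliferation minus apoptosis.
    The actual free boundary model in the paper is considerably richer."""
    r = r0
    for _ in range(int(days / dt)):
        r += dt * growth * r * (1.0 - r / r_max)
    return r

# Hypothetical spheroid starting at 40 um, plateauing near 300 um over 4 weeks.
radii = [spheroid_radius(40.0, 0.3, 300.0, days=d) for d in (0, 7, 14, 21, 28)]
```

The trajectory rises steeply at first and then saturates below r_max, mirroring the size plateau typically reported for multicellular spheroids in long-term culture.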
Abstract:
Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, such as ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in the prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling.
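The difference between ordinary and weighted least squares can be sketched in closed form for a single predictor. The data below (antecedent dry days versus pollutant build-up) and the choice of weights are hypothetical; they are not from the paper, and the Bayesian variant is not reproduced here.

```python
def fit_line(xs, ys, weights=None):
    """Closed-form (weighted) least squares for y = a + b*x.
    weights=None gives ordinary least squares; otherwise each observation
    is weighted, e.g. by an inverse-variance estimate."""
    w = weights or [1.0] * len(xs)
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, xs)) / sw   # weighted means
    my = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, xs, ys))
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, xs))
    b = sxy / sxx
    return my - b * mx, b                              # intercept, slope

# Hypothetical observations: antecedent dry days vs pollutant build-up.
dry_days = [1.0, 2.0, 4.0, 7.0, 10.0]
build_up = [1.2, 1.9, 4.1, 7.2, 9.8]
a_ols, b_ols = fit_line(dry_days, build_up)
# Down-weight the longer (sparser, noisier) dry periods.
a_wls, b_wls = fit_line(dry_days, build_up, weights=[1, 1, 1, 0.5, 0.5])
```

With heteroscedastic data the two fits diverge; a Bayesian treatment would go further by placing distributions on the coefficients and propagating that uncertainty via Monte Carlo simulation.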
Abstract:
Fire safety has become an important part of structural design due to the ever-increasing loss of property and lives during fires. The fire rating of load-bearing wall systems made of light gauge steel frames (LSF) is determined using fire tests based on the standard time-temperature curve given in ISO 834. However, modern residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed fire research study into the performance of load-bearing LSF walls was undertaken using a series of realistic design fire curves developed from Eurocode parametric curves and Barnett’s BFD curves. It included both full-scale fire tests and numerical studies of LSF walls without any insulation, and of the recently developed externally insulated composite panels. This paper presents the details of the fire tests first, and then the numerical models of the tested LSF wall studs. It shows that suitable finite element models can be developed to predict the fire rating of load-bearing walls under real fire conditions. The paper also describes the structural and fire performance of externally insulated LSF walls in comparison to non-insulated walls under real fires, and highlights the effects of standard and real fire curves on the fire performance of LSF walls.
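The heating phase of the Eurocode parametric fire curve (EN 1991-1-2) can be evaluated directly; a sketch follows. The factor gamma, which depends on the compartment's opening factor and thermal inertia, is left as a plain input here rather than computed from compartment properties, so treat this as an illustrative fragment rather than a full design calculation.

```python
import math

def parametric_fire_temp(t_hours, gamma=1.0):
    """Heating-phase gas temperature (deg C) of the Eurocode parametric
    fire curve (EN 1991-1-2), evaluated at expanded time t* = t * gamma.
    gamma = 1 approximates the standard fire exposure."""
    ts = t_hours * gamma
    return 20.0 + 1325.0 * (1.0 - 0.324 * math.exp(-0.2 * ts)
                                - 0.204 * math.exp(-1.7 * ts)
                                - 0.472 * math.exp(-19.0 * ts))

# Gas temperatures at 0, 30, 60 and 120 minutes for gamma = 1.
temps = [parametric_fire_temp(t) for t in (0.0, 0.5, 1.0, 2.0)]
```

At t = 0 the expression collapses to the ambient 20 °C, and with gamma = 1 the 30-minute value lands close to the ISO 834 standard curve, which is one reason parametric curves are a convenient basis for realistic design fires.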
Abstract:
In this work we discuss the effects of white and coloured noise perturbations on the parameters of a mathematical model of bacteriophage infection introduced by Beretta and Kuang in [Math. Biosc. 149 (1998) 57]. We numerically simulate the strong solutions of the resulting systems of stochastic ordinary differential equations (SDEs), with respect to the global error, by means of numerical methods of both Euler-Taylor expansion and stochastic Runge-Kutta type.
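The simplest member of the Euler-Taylor family for SDEs is the Euler-Maruyama scheme. The sketch below applies it to scalar geometric Brownian motion as a stand-in (the Beretta-Kuang bacteriophage system is multidimensional, and its drift and diffusion terms are not reproduced here).

```python
import math, random

def euler_maruyama(x0, drift, diffusion, t_end, steps, rng):
    """Euler-Maruyama scheme (strong order 0.5) for dX = a(X)dt + b(X)dW."""
    dt = t_end / steps
    x = x0
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment ~ N(0, dt)
        x += drift(x) * dt + diffusion(x) * dw
    return x

# Stand-in test problem: geometric Brownian motion dX = mu*X dt + sigma*X dW,
# for which E[X_T] = x0 * exp(mu * T) is known in closed form.
mu, sigma = 0.05, 0.2
rng = random.Random(42)
paths = [euler_maruyama(1.0, lambda x: mu * x, lambda x: sigma * x,
                        t_end=1.0, steps=200, rng=rng) for _ in range(2000)]
mean_xt = sum(paths) / len(paths)
```

Comparing the sample mean of many simulated paths against the analytic expectation exp(0.05) ≈ 1.051 is the usual first check before measuring strong (pathwise) error against a finer reference solution.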
Abstract:
As one of the longest running franchises in cinema history, and with its well-established use of product placements, the James Bond film series provides an ideal framework within which to measure and catalogue the number and types of products used within a particular timeframe. This case study will draw upon extensive content analysis of the James Bond film series in order to chart the evolution of product placement across the franchise's 50 year history.
Abstract:
Articular cartilage is a complex structure in which fluid-swollen proteoglycans are constrained within a 3D network of collagen fibrils. Because of this structural complexity, the relationship between the tissue’s mechanical behaviours at the macro-scale level and its components at the micro-scale level is not completely understood. The research objective of this thesis is to create a new model of articular cartilage that can be used to simulate and obtain insight into the micro-macro interaction and the mechanisms underlying its mechanical responses during physiological function. The new model of articular cartilage has two characteristics, namely: i) it does not use the fibre-reinforced composite material idealization; and ii) it provides a framework for probing the micro-mechanism of the fluid-solid interaction underlying the deformation of articular cartilage, using simple rules of repartition instead of constitutive/physical laws and intuitive curve-fitting. Even though there are various microstructural and mechanical behaviours that can be studied, the scope of this thesis is limited to osmotic pressure formation and distribution and their influence on cartilage fluid diffusion and percolation, which in turn govern the deformation of the compression-loaded tissue. The study can be divided into two stages. In the first stage, the distributions and concentrations of proteoglycans, collagen and water were investigated using histological protocols. Based on this, the structure of cartilage was conceptualised as microscopic osmotic units consisting of these constituents, distributed according to the histological results. These units were repeated three-dimensionally to form the structural model of articular cartilage.
In the second stage, cellular automata were incorporated into the resulting matrix (lattice) to simulate the osmotic pressure of the fluid and the movement of water within and out of the matrix, following the osmotic pressure gradient in accordance with the chosen rule of repartition of the pressure. The outcome of this study is a new model of articular cartilage that can be used to simulate and study the micromechanical behaviours of cartilage under different conditions of health and loading. These behaviours are illuminated at the micro-scale level using the so-called neighbourhood rules developed in the thesis in accordance with the typical requirements of cellular automata modelling. Using these rules and relevant boundary conditions to simulate pressure distribution and related fluid motion produced significant results that provided insight into the relationships between the osmotic pressure gradient, the associated fluid micro-movement, and the deformation of the matrix. For example, it could be concluded that: 1. It is possible to model articular cartilage with the agent-based model of cellular automata and the Margolus neighbourhood rule. 2. The concept of 3D interconnected osmotic units is a viable structural model for the extracellular matrix of articular cartilage. 3. Different rules of osmotic pressure advection lead to different patterns of deformation in the cartilage matrix, enabling an insight into how this micromechanism influences macromechanical deformation. 4. When features such as the transition coefficient (representing permeability) are altered due to changes in the concentrations of collagen and proteoglycans (i.e. degenerative conditions), the deformation process is affected. 5. The boundary conditions also influence the relationship between osmotic pressure gradient and fluid movement at the micro-scale level.
The outcomes are important to cartilage research since they can be used to study micro-scale damage in the cartilage matrix. From this, we are able to monitor related diseases and their progression, leading to potential insight into drug-cartilage interaction for treatment. This innovative model represents incremental progress in the effort to create further computational modelling approaches for cartilage research and other fluid-saturated tissues and material systems.
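The Margolus neighbourhood mentioned in conclusion 1 partitions the lattice into 2x2 blocks whose boundaries shift on alternate steps. The toy update below equalises fluid content within each block (a crude surrogate for the thesis's pressure-driven repartition rules, which are not reproduced here); it is mass-conserving by construction.

```python
def margolus_step(grid, offset):
    """One Margolus-style update on a square lattice: partition into 2x2
    blocks (shifted by `offset` on alternate steps) and equalise fluid
    content within each block. A toy repartition rule, mass-conserving."""
    n = len(grid)
    for i in range(offset, n - 1, 2):
        for j in range(offset, n - 1, 2):
            block = [(i, j), (i, j + 1), (i + 1, j), (i + 1, j + 1)]
            mean = sum(grid[r][c] for r, c in block) / 4.0
            for r, c in block:
                grid[r][c] = mean
    return grid

# A 4x4 lattice with all fluid initially in one corner cell.
grid = [[0.0] * 4 for _ in range(4)]
grid[0][0] = 16.0
for step in range(6):
    margolus_step(grid, step % 2)   # alternate the block partition offset
total = sum(sum(row) for row in grid)
```

Alternating the partition offset is what lets fluid propagate across block boundaries; with a fixed partition, each 2x2 block would be an isolated compartment.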