694 results for Solid modelling


Relevance:

20.00%

Publisher:

Abstract:

Microwave power is used for heating and drying processes because of its rapid, volumetric heating capability. Non-uniform temperature distribution during microwave application is a major drawback of these processes. Intermittent application of microwave power can reduce the impact of this non-uniformity and improve energy efficiency by allowing the temperature to redistribute. However, temperature redistribution during intermittent microwave heating has not been investigated adequately. Consequently, in this study, a coupled electromagnetic, heat, and mass transfer model was developed using the finite element method, implemented in COMSOL Multiphysics. In particular, the temperature redistribution due to intermittent heating was investigated. A series of experiments was performed to validate the simulation. The test specimen was an apple, and the temperature distribution was closely monitored with a thermal imaging camera (TIC). The simulated temperature profiles closely matched the thermal images obtained from the experiments.

Relevance:

20.00%

Publisher:

Abstract:

Retired business professionals represent an unexplored source of skill support for struggling rural communities. This research examined the feasibility of drawing on this valuable pool of knowledge and experience by engaging retirees in short-term, project-based volunteering roles in rural not-for-profit agencies. Using the theory of planned behaviour and the functional approach to volunteering, the program of study generated a model comprising the key psychological and contextual factors determining volunteers' decisions to provide skill assistance in rural settings. The model provides a useful resource for creating suitable volunteering opportunities and for informing volunteer recruitment strategies.

Relevance:

20.00%

Publisher:

Abstract:

Process modelling – the design and use of graphical documentations of an organisation’s business processes – is a key method to document and use information about business processes. Still, despite current interest in process modelling, this research area faces essential challenges. Key unanswered questions concern the impact of process modelling in organisational practice, and the mechanisms through which impacts are developed. To answer these questions and to provide a better understanding of process modelling impact, I turn to the concept of affordances. Affordances describe the possibilities for goal-oriented action that technical objects offer to specified users. This notion has received growing attention from IS researchers. I report on my efforts to further develop the IS discipline’s understanding of affordances and impacts from informational objects, such as process models used by analysts for purposes of information systems analysis and design. Specifically, I seek to extend existing theory on the emergence and actualisation of affordances. I develop a research model that describes the process by which affordances are perceived and actualised and explain their dependence on available information and actualisation effort. I present my plans for operationalising and testing this research model empirically, and provide details about my design of a full-cycle, mixed methods study currently in progress.

Relevance:

20.00%

Publisher:

Abstract:

Thalidomide is an anti-angiogenic agent currently used to treat patients with malignant cachexia or multiple myeloma. Lenalidomide (CC-5013) is an immunomodulatory thalidomide analogue licensed in the United States of America (USA) for the treatment of a subtype of myelodysplastic syndrome. This two-centre, open-label phase I study evaluated dose-limiting toxicities in 55 patients with malignant solid tumours refractory to standard chemotherapies. Lenalidomide capsules were consumed once daily for 12 weeks according to one of the following three schedules: (I) 25 mg daily for the first 7 d, the daily dose increased by 25 mg each week up to a maximum daily dose of 150 mg; (II) 25 mg daily for 21 d followed by a 7-d rest period, the 4-week cycle repeated for 3 cycles; (III) 10 mg daily continuously. Twenty-six patients completed the study period. Two patients experienced a grade 3 hypersensitivity rash. Four patients in cohort I and 4 patients in cohort II suffered grade 3 or 4 neutropaenia. In 2 patients with predisposing medical factors, grade 3 cardiac dysrhythmia was recorded. Grade 1 neurotoxicity was detected in 6 patients. One complete and two partial radiological responses were measured by computed tomography scanning; 8 patients had stable disease after 12 weeks of treatment. Fifteen patients remained on treatment as named patients; 1 with metastatic melanoma remains in clinical remission 3.5 years from trial entry. This study indicates the tolerability and potential clinical efficacy of lenalidomide in patients with advanced solid tumours who have previously received multi-modality treatment. Depending on the extent of myelosuppressive pre-treatment, dose schedules (II) or (III) are advocated for large-scale trials of long-term administration. © 2006 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The double-pass counter-flow v-groove collector is considered one of the most efficient solar air collectors. In this design, the inlet air initially flows along the top part of the collector, changes direction once it reaches the end, and flows beneath the collector to the outlet. A mathematical model was developed for this type of collector and simulations were carried out in MATLAB. The simulation results were verified against three published research results, and the comparison with experimental data showed that the simulation can accurately predict the performance of the air collector. The difference between the predicted and experimental results is at most approximately 7%, which is within an acceptable limit given the uncertainties in the input parameter values. A parametric study found that solar radiation, inlet air temperature, flow rate and collector length have a significant effect on the efficiency of the air collector. Additionally, the results were compared with those of a single-flow v-groove collector.
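A drastically simplified, flat-plate-style energy balance (not the double-pass v-groove model of the study) already reproduces the qualitative parametric trend reported above: efficiency falls as the inlet temperature rises above ambient. All coefficients below are assumed placeholder values.

```python
# Simplified steady-state energy balance for a generic solar air collector
# (a flat-plate-style sketch, not the double-pass v-groove model of the study).
# tau_alpha, U_L and area are assumed placeholder coefficients.
def collector(G, T_in, T_amb, m_dot, area=2.0, tau_alpha=0.85,
              U_L=6.0, c_p=1005.0, n_iter=100):
    T_out = T_in
    for _ in range(n_iter):                       # fixed-point iteration on mean temp
        T_mean = 0.5 * (T_in + T_out)
        q_useful = area * (tau_alpha * G - U_L * (T_mean - T_amb))  # W absorbed - lost
        T_out = T_in + q_useful / (m_dot * c_p)   # useful power -> temperature rise
    eta = m_dot * c_p * (T_out - T_in) / (G * area)
    return T_out, eta

T_out, eta = collector(G=800.0, T_in=300.0, T_amb=300.0, m_dot=0.05)
```

Sweeping `G`, `T_in` or `m_dot` in this sketch mirrors the kind of parametric study reported, though the real model resolves the two passes and the v-groove geometry.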

Relevance:

20.00%

Publisher:

Abstract:

Articular cartilage is a load-bearing tissue consisting of proteoglycan macromolecules entrapped between collagen fibrils in a three-dimensional architecture. To date, the search for mathematical models to represent the biomechanics of such a system has continued without yielding a fitting description of its functional response to load at the micro-scale. We believe the major complication arose when cartilage was first envisaged as a multiphasic model with distinguishable components: quantifying those components and searching for the laws that govern their interaction is inadequate. On the thesis of this paper, cartilage as a bulk is as much a continuum as is the response of its components to external stimuli. For this reason, we framed the fundamental question as what the mechano-structural functionality of such a system would be in the total absence of one of its key constituents: proteoglycans. To answer this, hydrated normal and proteoglycan-depleted samples were tested under confined compression, while finite element models were reproduced, for the first time, based on the structural microarchitecture of the cross-sectional profiles of the matrices. These micro-porous in silico models served as virtual transducers, providing an internal, non-invasive probing mechanism beyond experimental capabilities to render the micromechanics of the matrices and several other properties, such as permeability and orientation. The results demonstrated that load transfer was closely related to the microarchitecture of the hyperelastic models, which represent the solid-skeleton stress and fluid response based on the state of the collagen network with and without the swollen proteoglycans. In other words, the stress gradient during deformation was a function of the structural pattern of the network and acted in concert with the position-dependent compositional state of the matrix. This reveals that the interaction between indistinguishable components in real cartilage is superimposed by its microarchitectural state, which directly influences macromechanical behaviour.

Relevance:

20.00%

Publisher:

Abstract:

This thesis gave a brief account of fluoride removal using acid-treated and thermally treated red mud. It showed the importance of a low and consistent pH, and of an appropriate treatment temperature, for the removal of fluoride from aqueous solutions using red mud. According to the data analysis, treating red mud at 1000°C and keeping the pH at around 4 achieved the greatest fluoride adsorption.

Relevance:

20.00%

Publisher:

Abstract:

With the spread of social media websites on the internet, and the huge number of users participating in and generating vast amounts of content on these websites, personalisation has become a necessity. One of the major issues in personalisation is building users' profiles, which depend on many elements, such as the data used, the application domain they aim to serve, the representation method and the construction methodology. Recently, this area of research has been a focus for many researchers, and hence the number of proposed methods is increasing very quickly. This survey discusses the available user modelling techniques for social media websites, highlights the weaknesses and strengths of these methods, and provides a vision for future work in user modelling for social media websites.

Relevance:

20.00%

Publisher:

Abstract:

This study considered the problem of predicting survival based on three alternative models: a single Weibull, a mixture of Weibulls and a cure model. Instead of the common procedure of choosing a single "best" model, where "best" is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as "best", suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival.

Keywords: Bayesian modelling; Bayesian model averaging; cure model; Markov chain Monte Carlo; mixture model; survival analysis; Weibull distribution
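BMA weights over candidate models can be approximated from each model's BIC, since exp(-BIC/2) approximates the marginal likelihood. The three BIC values below are invented for illustration and are not from the lymphoma case study.

```python
import math

# Approximate posterior model probabilities from BIC values:
# w_k proportional to exp(-(BIC_k - min BIC) / 2), normalised to sum to one.
def bma_weights(bics):
    best = min(bics)
    raw = [math.exp(-0.5 * (b - best)) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Invented BICs for a single Weibull, a Weibull mixture and a cure model.
bics = [1052.3, 1049.8, 1050.6]
weights = bma_weights(bics)
```

A BMA prediction is then the weight-averaged prediction of the fitted models, rather than the prediction of the single lowest-BIC model alone.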

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-utterance i-vectors vary with speaker, session variation, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalised LDA (SN-LDA) and within-class covariance normalisation (WCCN) exist for compensating for session variation, but we have identified the variability introduced by phonetic content, due to utterance variation, as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short-utterance i-vector speaker verification systems using cosine similarity scoring (CSS), we introduce a short utterance variance normalisation (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations, and is shown to improve performance over the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using probabilistic linear discriminant analysis (PLDA) to directly model the SUV. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that, for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
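The cosine similarity scoring (CSS) step is straightforward to sketch. The i-vectors below are random stand-ins; a real system would first apply the LDA/SN-LDA projection and WCCN described above.

```python
import numpy as np

# Cosine similarity scoring between an enrolment i-vector and a test i-vector.
# The vectors here are random placeholders, not extracted from speech.
def css_score(w_enrol, w_test):
    return float(np.dot(w_enrol, w_test)
                 / (np.linalg.norm(w_enrol) * np.linalg.norm(w_test)))

rng = np.random.default_rng(0)
enrol = rng.normal(size=400)                  # enrolment i-vector (placeholder)
same = enrol + 0.2 * rng.normal(size=400)     # noisy short segment, same speaker
other = rng.normal(size=400)                  # different speaker
```

Verification then thresholds the score: same-speaker pairs score near 1, different-speaker pairs near 0; short utterances widen the spread, which is what SUVN/SUV modelling addresses.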

Relevance:

20.00%

Publisher:

Abstract:

The interest generated by business process re-engineering and information technology reveals the emergence of the process-management paradigm. Although many studies have been published on alternative process modelling tools and techniques, little attention has been paid to the post-hoc evaluation of process modelling activities, or to establishing guidelines on how to conduct process modelling effectively. The present study aims to fill this gap. We present the results of a detailed case study, conducted in a leading Australian organisation, with the aim of building a model of process modelling success.

Relevance:

20.00%

Publisher:

Abstract:

Global awareness of cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being increasingly integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is 'how can we manage this constantly evolving grid?' The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact that the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM), to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimisation (PSO), to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks have been modelled: three-phase networks, usually used in dense networks such as urban areas, and Single Wire Earth Return (SWER) networks, widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: (a) what is already in place and how it performs under current and future loads, (b) what can be done to manage it and plan the future grid, and (c) how these upgrades and new installations will perform over time. The number of small-scale distributed generators, e.g. PV and battery systems, is now sufficient (and expected to increase) to affect the operation of the grid, which in turn needs to be considered by the distribution network manager when planning upgrades and installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, both in place and expected, so that a large number of options can be assessed (step a). Once the location, sizing and timing of asset upgrades and installations are found using optimisation techniques (step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once, while still having a tailored solution for each of its sub-areas. To illustrate this, using the impact that batteries and PV can have on the two types of networks mentioned above, three design conditions can be identified (amongst others):

- Urban conditions:
  - Feeders with a low take-up of solar generators may benefit from adding solar panels.
  - Feeders that need voltage support at specific times may be assisted by installing batteries.
- Rural conditions (SWER network):
  - Feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations.

This small example demonstrates that no single solution can be applied across all three areas, and there is a need to be selective about which one is applied to each branch of the network. This is currently the function of the engineer, who can define various scenarios against a configuration, test them and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
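The PSO component can be sketched on a toy problem. The quadratic cost below is a stand-in for the (far more complex) network-upgrade cost model, and the swarm parameters are conventional defaults rather than project values.

```python
import numpy as np

# Minimal global-best particle swarm optimisation. The quadratic cost is a toy
# stand-in for the network-upgrade cost model described above; swarm parameters
# are conventional defaults, not values from the project.
def pso(cost, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # candidate solutions
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([cost(p) for p in x])
        better = vals < pbest_val                    # update personal bests
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()   # update global best
    return gbest, float(pbest_val.min())

best, best_cost = pso(lambda p: float(np.sum(p ** 2)))
```

In the platform described above, each dimension would encode a decision such as the size or timing of an asset upgrade, and the ABM would evaluate each candidate's daily performance.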

Relevance:

20.00%

Publisher:

Abstract:

Stigmergy is a biological term used when discussing a subset of insect swarm behaviour, describing the apparent organisation seen during their activities. Stigmergy describes a communication mechanism based on environment-mediated signals which trigger responses among the insects. This phenomenon is demonstrated in the behaviour of ants and their food-gathering process when following pheromone trails, where the pheromones are a form of environment-mediated communication. What is interesting about this phenomenon is that highly organised societies are achieved without an apparent management structure. Stigmergy is also observed in human environments, both natural and engineered. It is implicit in the Web, where sites provide a virtual environment supporting coordinative contributions. Researchers in varying disciplines appreciate the power of this phenomenon and have studied how to exploit it. As stigmergy becomes more widely researched, we see its definition mutate as papers citing the original work become referenced themselves, with each paper interpreting these works in ways specific to the research being conducted. Our own research aims to better understand what improves the collaborative function of a Web site when exploiting the phenomenon. However, when researching stigmergy to develop our understanding, we discovered the lack of a standardised, abstract model of the phenomenon. Papers frequently cite the same generic descriptions before becoming intimately focused on formal specifications of an algorithm, or on esoteric discussions of sub-facets of the topic. None provides a holistic, macro-level view to model and standardise the nomenclature. This paper provides a content analysis of the influential literature, documenting the numerous theoretical and experimental papers that have focused on stigmergy. We establish that stigmergy is a phenomenon that transcends the insect world and is more than just a metaphor when applied to the human world. We present, from our own research, our general theory and abstract model of the semantics of stigma in stigmergy. We hope our model will clarify the nuances of the phenomenon into a useful road-map, and standardise vocabulary that we witness becoming confused and divergent. Furthermore, this paper documents the analysis on which we base our next paper: Special Theory of Stigmergy: A Design Pattern for Web 2.0 Collaboration.

Relevance:

20.00%

Publisher:

Abstract:

Many mature term-based or pattern-based approaches have been used in the field of information filtering to generate users' information needs from a collection of documents. A fundamental assumption of these approaches is that the documents in the collection are all about one topic. In reality, however, users' interests can be diverse and the documents in the collection often involve multiple topics. Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models representing multiple topics in a collection of documents, and has been widely utilised in fields such as machine learning and information retrieval, but its effectiveness in information filtering has not been well explored. Patterns are generally thought to be more discriminative than single terms for describing documents. However, the enormous number of discovered patterns hinders their effective and efficient use in real applications; therefore, selecting the most discriminative and representative patterns from the huge number of discovered patterns becomes crucial. To deal with these limitations and problems, this paper proposes a novel information filtering model, the Maximum matched Pattern-based Topic Model (MPBTM). The main distinctive features of the proposed model are: (1) user information needs are generated in terms of multiple topics; (2) each topic is represented by patterns; (3) patterns are generated from topic models and are organised in terms of their statistical and taxonomic features; and (4) the most discriminative and representative patterns, called maximum matched patterns, are proposed to estimate the relevance of a document to the user's information needs in order to filter out irrelevant documents. Extensive experiments are conducted to evaluate the effectiveness of the proposed model using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model significantly outperforms both state-of-the-art term-based models and pattern-based models.
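The "maximum matched pattern" idea can be illustrated with a toy scorer: a document's relevance to a topic is taken as the weight of the heaviest pattern it fully contains. The patterns and weights below are invented for illustration; the MPBTM derives them from topic models over the collection.

```python
# Toy maximum-matched-pattern scorer. The patterns and their discriminative
# weights are invented placeholders; MPBTM derives them from topic models.
topic_patterns = {
    frozenset({"solar", "collector"}): 0.9,   # specific, highly discriminative
    frozenset({"heat", "transfer"}): 0.7,
    frozenset({"solar"}): 0.4,                # general, less discriminative
}

def max_matched_score(document_terms, patterns):
    terms = set(document_terms)
    # a pattern matches if every one of its terms appears in the document
    matched = [w for pattern, w in patterns.items() if pattern <= terms]
    return max(matched, default=0.0)          # weight of the best full match

score = max_matched_score(["solar", "collector", "air"], topic_patterns)
```

Documents scoring below a threshold against every topic would then be filtered out as irrelevant.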

Relevance:

20.00%

Publisher:

Abstract:

Spreading cell fronts play an essential role in many physiological processes. Classically, models of this process are based on the Fisher-Kolmogorov equation; however, such continuum representations are not always suitable, as they do not explicitly represent behaviour at the level of individual cells. Additionally, many models examine only the large-time asymptotic behaviour, where a travelling wave front with a constant speed has been established. Many experiments, such as a scratch assay, never display this asymptotic behaviour, and in these cases the transient behaviour must be taken into account. We examine the transient and asymptotic behaviour of moving cell fronts using techniques that go beyond the continuum approximation, via a volume-excluding birth-migration process on a regular one-dimensional lattice. We approximate the averaged discrete results using three methods: (i) mean-field, (ii) pair-wise, and (iii) one-hole approximations. We discuss the performance of these methods, in comparison to the averaged discrete results, across a range of parameter space, examining both the transient and asymptotic behaviours. The one-hole approximation, based on techniques from statistical physics, is not capable of predicting transient behaviour but provides excellent agreement with the asymptotic behaviour of the averaged discrete results, provided that cells are proliferating fast enough relative to their rate of migration. The mean-field and pair-wise approximations give indistinguishable asymptotic results, which agree with the averaged discrete results when cells are migrating much more rapidly than they are proliferating. The pair-wise approximation performs better in the transient region than does the mean-field, despite having the same asymptotic behaviour. Our results show that each approximation only works in specific situations, and thus we must be careful to use a suitable approximation for a given system, otherwise inaccurate predictions could be made.
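At the spatially uniform mean-field level, the average occupancy of such a volume-excluding birth process reduces to the logistic equation dC/dt = λC(1 - C). The sketch below integrates this with forward Euler, using illustrative values for the initial occupancy and proliferation rate rather than parameters from the study.

```python
import numpy as np

# Forward-Euler integration of the mean-field occupancy equation
# dC/dt = lam * C * (1 - C) for a volume-excluding birth process.
# Initial occupancy and rate are illustrative only.
def mean_field_occupancy(C0=0.01, lam=1.0, t_end=20.0, dt=0.01):
    steps = int(t_end / dt)
    C = np.empty(steps + 1)
    C[0] = C0
    for i in range(steps):
        C[i + 1] = C[i] + dt * lam * C[i] * (1.0 - C[i])   # explicit Euler step
    return C

C = mean_field_occupancy()
```

The occupancy rises sigmoidally and saturates at the carrying capacity C = 1, the mean-field analogue of the fully occupied lattice behind the invading front; the pair-wise and one-hole approximations discussed above refine this by tracking spatial correlations that the mean-field closure ignores.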