800 results for integral model
at Queensland University of Technology - ePrints Archive
Abstract:
Fleck and Johnson (Int. J. Mech. Sci. 29 (1987) 507) and Fleck et al. (Proc. Inst. Mech. Eng. 206 (1992) 119) have developed foil rolling models which allow for large deformations in the roll profile, including the possibility that the rolls flatten completely. However, these models require computationally expensive iterative solution techniques. A new approach to the approximate solution of the Fleck et al. (1992) Influence Function Model has been developed using both analytic and approximation techniques. The numerical difficulties arising from solving an integral equation in the flattened region have been reduced by applying an Inverse Hilbert Transform to obtain an analytic expression for the pressure. The method described in this paper is applicable to cases both with and without a flat region.
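For readers unfamiliar with the inversion step, a standard form of the finite Hilbert (airfoil-type) transform and its Tricomi inversion is sketched below; the notation is illustrative and not necessarily that of the paper, but it is this kind of identity that yields an analytic expression for the pressure in the flattened region.

```latex
% Airfoil-type singular integral equation on (-1, 1) and one standard inversion
% (illustrative notation; f plays the role of the unknown pressure-like quantity):
\frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-1}^{1}\frac{f(t)}{t-x}\,\mathrm{d}t \;=\; g(x)
\quad\Longrightarrow\quad
f(x) \;=\; -\frac{1}{\pi\sqrt{1-x^{2}}}\;\mathrm{p.v.}\!\int_{-1}^{1}\frac{\sqrt{1-t^{2}}\,g(t)}{t-x}\,\mathrm{d}t
\;+\;\frac{C}{\sqrt{1-x^{2}}}
```

with the constant C fixed by an auxiliary condition such as a force balance.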
Abstract:
With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Given the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery and match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, as well as the input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the user's requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web service description language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost for traversal. The third phase, system integration, integrates the results from the preceding two phases by using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
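The link-analysis phase is described only at the level of "all-pairs shortest paths over a graph of services". A minimal, generic sketch of that step is given below; the service names, linking costs and the choice of the Floyd-Warshall algorithm are illustrative assumptions, not the implementation used in the research.

```python
# Minimal all-pairs shortest-path sketch for service composition. Nodes are Web
# services; edge weights are hypothetical linking costs (e.g. derived from
# input/output parameter compatibility). Not the thesis implementation.
INF = float("inf")

def floyd_warshall(nodes, edges):
    """Return distance and next-hop tables for every pair of services."""
    dist = {u: {v: (0 if u == v else INF) for v in nodes} for u in nodes}
    nxt = {u: {v: None for v in nodes} for u in nodes}
    for (u, v), w in edges.items():
        dist[u][v], nxt[u][v] = w, v
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
                    nxt[i][j] = nxt[i][k]
    return dist, nxt

def compose(nxt, src, dst):
    """Reconstruct the cheapest chain of services from src to dst."""
    if nxt[src][dst] is None:
        return None
    path = [src]
    while src != dst:
        src = nxt[src][dst]
        path.append(src)
    return path

services = ["geocode", "weather", "alert"]                 # hypothetical services
links = {("geocode", "weather"): 1.0, ("weather", "alert"): 2.0}
dist, nxt = floyd_warshall(services, links)
print(compose(nxt, "geocode", "alert"))                    # ['geocode', 'weather', 'alert']
```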
Abstract:
The ICU is an integral part of any hospital and is under great load from patient arrivals as well as resource limitations. Scheduling of patients in the ICU is complicated by the two general types of admission: elective surgery and emergency arrivals. This complicated situation is handled by creating a tentative initial schedule and then reacting to uncertain arrivals as they occur. For most hospitals there is little or no flexibility in the number of beds that are available for use now or in the future. We propose an integer programming model for a parallel-machine reactive scheduling system that handles both scheduled and unscheduled arrivals.
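The abstract does not give the formulation, so the following is only an illustrative sketch of a parallel-machine (bed) assignment integer program, with made-up patients, beds and days, written with the open-source PuLP library; the reactive rescheduling component is omitted.

```python
# Illustrative bed-assignment IP (not the authors' formulation): assign each
# patient to at most one ICU bed on their requested day, maximising admissions.
# All data below are made up.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

patients = {"elective_1": 0, "elective_2": 0, "emergency_1": 1}   # requested day
beds, days = ["bed_1", "bed_2"], [0, 1]

prob = LpProblem("icu_bed_assignment", LpMaximize)
x = {(p, b): LpVariable(f"x_{p}_{b}", cat=LpBinary) for p in patients for b in beds}

# Objective: admit as many patients as possible.
prob += lpSum(x[p, b] for p in patients for b in beds)

# Each patient occupies at most one bed.
for p in patients:
    prob += lpSum(x[p, b] for b in beds) <= 1

# Each bed holds at most one patient per day.
for b in beds:
    for d in days:
        prob += lpSum(x[p, b] for p, day in patients.items() if day == d) <= 1

prob.solve(PULP_CBC_CMD(msg=False))
for (p, b), var in x.items():
    if var.value() == 1:
        print(p, "->", b)
```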
Abstract:
The collaboration of clinicians with basic science researchers is crucial for addressing clinically relevant research questions. In order to initiate such mutually beneficial relationships, we propose a model where early career clinicians spend a designated time embedded in established basic science research groups, in order to pursue a postgraduate qualification. During this time, clinicians become integral members of the research team, fostering long term relationships and opening up opportunities for continuing collaboration. However, for these collaborations to be successful there are pitfalls to be avoided. Limited time and funding can lead to attempts to answer clinical challenges with highly complex research projects characterised by a large number of "clinical" factors being introduced in the hope that the research outcomes will be more clinically relevant. As a result, the complexity of such studies and the variability of their outcomes may lead to difficulties in drawing scientifically justified and clinically useful conclusions. Consequently, we stress that it is the obligation of both the basic science researcher and the clinician to be mindful of the limitations and challenges of such multi-factorial research projects. A systematic step-by-step approach to addressing clinical research questions with limited, but highly targeted and well-defined research projects provides the solid foundation which may lead to the development of a longer term research program for addressing more challenging clinical problems. Ultimately, we believe that it is such models, encouraging the vital collaboration between clinicians and researchers to work on targeted, well-defined research projects, which will result in answers to the important clinical challenges of today.
Abstract:
In recent years, ocean scientists have started to employ many new forms of technology as integral pieces in oceanographic data collection for the study and prediction of complex and dynamic ocean phenomena. One area of technological advancement in ocean sampling is the use of Autonomous Underwater Vehicles (AUVs) as mobile sensor platforms. Currently, most AUV deployments execute a lawnmower-type pattern or repeated transects for surveys and sampling missions. An advantage of these missions is that the regularity of the trajectory design generally makes it easier to extract the exact path of the vehicle via post-processing. However, if the deployment region for the pattern is poorly selected, the AUV can entirely miss collecting data during an event of specific interest. Here, we consider an innovative technology toolchain to assist in determining the deployment location and executed paths for AUVs to maximize scientific information gain about dynamically evolving ocean phenomena. In particular, we provide an assessment of computed paths based on ocean model predictions designed to put AUVs in the right place at the right time to gather data related to the understanding of algal and phytoplankton blooms.
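As a rough illustration of planning against a forecast (not the toolchain assessed in the paper), the sketch below greedily steers a vehicle over a gridded ocean-model prediction of a bloom indicator; the grid, values and greedy criterion are all assumptions.

```python
# Toy deployment-planning sketch: given a gridded forecast of a bloom indicator
# (e.g. chlorophyll), greedily choose the next waypoint among the 4-neighbours
# with the highest predicted value. Forecast values are made up.
import numpy as np

def plan_path(forecast, start, n_steps):
    """Greedy path over a 2-D forecast grid; visited cells are masked out."""
    rows, cols = forecast.shape
    field = forecast.astype(float)
    path, (r, c) = [start], start
    for _ in range(n_steps):
        field[r, c] = -np.inf                          # do not revisit
        neighbours = [(r + dr, c + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < rows and 0 <= c + dc < cols]
        r, c = max(neighbours, key=lambda rc: field[rc])
        path.append((r, c))
    return path

forecast = np.array([[0.1, 0.2, 0.9],
                     [0.3, 0.8, 0.4],
                     [0.0, 0.5, 0.7]])
print(plan_path(forecast, start=(0, 0), n_steps=4))
```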
Abstract:
A model for drug diffusion from a spherical polymeric drug delivery device is considered. The model contains two key features. The first is that solvent diffuses into the polymer, which then transitions from a glassy to a rubbery state. The interface between the two states of polymer is modelled as a moving boundary, whose speed is governed by a kinetic law; the same moving boundary problem arises in the one-phase limit of a Stefan problem with kinetic undercooling. The second feature is that drug diffuses only through the rubbery region, with a nonlinear diffusion coefficient that depends on the concentration of solvent. We analyse the model using both formal asymptotics and numerical computation, the latter by applying a front-fixing scheme with a finite volume method. Previous results are extended and comparisons are made with linear models that work well under certain parameter regimes. Finally, a model for a multi-layered drug delivery device is suggested, which allows for more flexible control of drug release.
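A schematic of the two model features in spherical geometry is given below; the symbols, signs and scalings are illustrative reconstructions rather than the paper's exact formulation.

```latex
% Solvent concentration c and drug concentration d in the rubbery region, with a
% glassy-rubbery interface at r = s(t) (illustrative form only):
\frac{\partial c}{\partial t}
  = \frac{1}{r^{2}}\frac{\partial}{\partial r}\!\left(r^{2}D_{c}\,\frac{\partial c}{\partial r}\right),
\qquad
\frac{\partial d}{\partial t}
  = \frac{1}{r^{2}}\frac{\partial}{\partial r}\!\left(r^{2}D_{d}(c)\,\frac{\partial d}{\partial r}\right),
\qquad
\frac{\mathrm{d}s}{\mathrm{d}t} \;\propto\; c\bigl(s(t),t\bigr) - c^{*}
```

Here the kinetic law for ds/dt plays the role of the kinetic undercooling condition in the corresponding one-phase Stefan problem, and the drug diffusivity D_d(c) increases with solvent concentration, so drug transport effectively occurs only in the rubbery region.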
Abstract:
This chapter proposes a conceptual model for optimal development of needed capabilities for the contemporary knowledge economy. We commence by outlining key capability requirements of the 21st century knowledge economy, distinguishing these from those suited to the earlier stages of the knowledge economy. We then discuss the extent to which higher education currently caters to these requirements and then put forward a new model for effective knowledge economy capability learning. The core of this model is the development of an adaptive and adaptable career identity, which is created through a reflective process of career self-management, drawing upon data from the self and the world of work. In turn, career identity drives the individual’s process of skill and knowledge acquisition, including deep disciplinary knowledge. The professional capability learning thus acquired includes disciplinary skill and knowledge sets, generic skills, and also skills for the knowledge economy, including disciplinary agility, social network capability, and enterprise skills. In the final part of this chapter, we envision higher education systems that embrace the model, and suggest steps that could be taken toward making the development of knowledge economy capabilities an integral part of the university experience.
Abstract:
Urban sustainability and sustainable urban development concepts have been identified as the ultimate goal of many contemporary planning endeavours and have become central concepts on which urban development policies are formulated. Within the frame of these concepts, land use and transport integration has been highlighted as one of the most important policy objectives, considering the interrelationship between the two and the intervention means available to planning. While its interpretation varies, in Australia it has been embraced as the integration of land use and transport planning/policies and has been an integral part of regional and local plans. Accordingly, a number of principles have been defined to guide its implementation, to name a few: planning for compact and connected urban development, encouraging active transport modes, creating mixed-use activity centres and public transport precincts, providing high-quality public transport services, and enhancing the character and amenity of urban areas. However, there is a lack of an evaluation framework to measure the extent to which these principles have been implemented. In pursuit of filling this gap, this study aims to devise an evaluation framework to measure the performance of urban settings against the integration principles in the South East Queensland, Australia context and to demarcate problematic areas that can be targeted by planning interventions...
Abstract:
Introduction: QC and EQA are integral to good pathology laboratory practice. Medical Laboratory Science students undertake a project exploring internal QC and EQA procedures used in chemical pathology laboratories. Each student represents an individual lab and the class group represents the peer group of labs performing the same assay using the same method. Methods: Using a manual BCG assay for serum albumin, normal and abnormal controls are run with a patient sample over 7 weeks. The QC results are assessed each week using calculated z-scores and both 2S & 3S control rules to determine whether a run is 'in control'. At the end of the 7 weeks a completed LJ chart is assessed using the Westgard Multirules. Students investigate causes of error and the implications for both lab practice and patient care if runs are not 'in control'. Twice in the 7 weeks, two EQA samples (with target values unknown) are assayed alongside the weekly QC and patient samples. Results from each student are collated and form the basis of an EQA program. ALP are provided and students complete a Youden Plot, which is used to analyse the performance of each 'lab' and the method to identify bias. Students explore the possible clinical implications of a biased method and address the actions that should be taken if a lab is not in consensus with the peer group. Conclusion: This project is a model of 'real world' practice in which students demonstrate an understanding of the importance of QC procedures in a pathology laboratory, apply and interpret statistics and QC rules and charts, apply critical thinking and analytical skills to quality performance data to make recommendations for further practice, and improve their technical competence and confidence.
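A minimal sketch of the weekly QC check is given below, with illustrative control limits and results (not real class data), assuming z-scores against an established mean and SD and 1-2S warning / 1-3S rejection rules.

```python
# Weekly QC check sketch: compute z-scores for the normal and abnormal controls
# against established mean/SD, then flag 2S warnings and 3S rejections.
def z_score(result, mean, sd):
    return (result - mean) / sd

def check_run(controls, limits):
    """controls: {level: measured value}; limits: {level: (mean, sd)}."""
    flags = {}
    for level, value in controls.items():
        z = z_score(value, *limits[level])
        if abs(z) > 3:
            status = "reject (1-3S)"
        elif abs(z) > 2:
            status = "warning (1-2S)"
        else:
            status = "in control"
        flags[level] = (round(z, 2), status)
    return flags

limits = {"normal": (38.0, 1.5), "abnormal": (24.0, 1.2)}   # g/L albumin, illustrative
week_3 = {"normal": 41.2, "abnormal": 24.8}
print(check_run(week_3, limits))   # the normal control breaches the 2S limit
```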
Abstract:
Laminar two-dimensional natural convection boundary-layer flow of non-Newtonian fluids along an isothermal horizontal circular cylinder has been studied using a modified power-law viscosity model. In this model, there are no unrealistic limits of zero or infinite viscosity. Therefore, the boundary-layer equations can be solved numerically using a marching-order implicit finite difference method with a double-sweep technique. Numerical results are presented for shear-thinning as well as shear-thickening fluids in terms of the fluid velocity and temperature distributions, together with the shear stresses and rate of heat transfer, expressed as the local skin friction and local Nusselt number respectively.
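One common form of a modified power-law viscosity, which keeps the viscosity finite and non-zero at both low and high shear rates, is sketched below; the thresholds and notation are illustrative rather than taken from the paper.

```latex
% Modified power-law viscosity: power-law behaviour only between two shear-rate
% thresholds, with constant viscosity outside that range (illustrative form).
\mu(\dot{\gamma}) =
\begin{cases}
\mu_{1}, & \dot{\gamma} \le \dot{\gamma}_{1},\\[4pt]
K\,\dot{\gamma}^{\,n-1}, & \dot{\gamma}_{1} < \dot{\gamma} < \dot{\gamma}_{2},\\[4pt]
\mu_{2}, & \dot{\gamma} \ge \dot{\gamma}_{2},
\end{cases}
\qquad
\mu_{1} = K\dot{\gamma}_{1}^{\,n-1},\quad \mu_{2} = K\dot{\gamma}_{2}^{\,n-1},
```

with n < 1 for shear-thinning and n > 1 for shear-thickening fluids, and the constants chosen so that the viscosity is continuous at the thresholds.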
Abstract:
A new optimal control model of the interactions between a growing tumour and the host immune system, along with an immunotherapy treatment strategy, is presented. The model is based on an ordinary differential equation model of interactions between the growing tumour and the natural killer, cytotoxic T lymphocyte and dendritic cells of the host immune system, extended through the addition of a control function representing the application of a dendritic cell treatment to the system. The numerical solution of this model, obtained from a multi-species Runge–Kutta forward-backward sweep scheme, is described. We investigate the effects of varying the maximum allowed amount of dendritic cell vaccine administered to the system and find that control of the tumour cell population is best effected via a high initial vaccine level, followed by reduced treatment and finally cessation of treatment. We also find that increasing the strength of the dendritic cell vaccine causes an increase in the number of natural killer cells and lymphocytes, which in turn reduces the growth of the tumour.
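The forward-backward sweep idea can be sketched on a small linear-quadratic test problem; the dynamics, cost and control update below are illustrative and deliberately much simpler than the tumour-immune model.

```python
# Forward-backward sweep sketch on an illustrative problem (not the tumour-immune
# model): minimise  \int_0^T (x^2 + u^2) dt  subject to  x' = -x + u, x(0) = 1.
# Pontryagin gives u = -lambda/2 and the adjoint lambda' = -2x + lambda, lambda(T) = 0.
import numpy as np

T, N, x0 = 1.0, 400, 1.0
h = T / N
u = np.zeros(N + 1)                                  # initial control guess
x = np.zeros(N + 1)
lam = np.zeros(N + 1)

state_rhs = lambda x_, u_: -x_ + u_                  # x'
adjoint_rhs = lambda l_, x_: -2.0 * x_ + l_          # lambda'

for _ in range(100):
    u_old = u.copy()

    # Forward sweep (RK4) for the state; control interpolated at the midpoint.
    x[0] = x0
    for i in range(N):
        um = 0.5 * (u[i] + u[i + 1])
        k1 = state_rhs(x[i], u[i])
        k2 = state_rhs(x[i] + 0.5 * h * k1, um)
        k3 = state_rhs(x[i] + 0.5 * h * k2, um)
        k4 = state_rhs(x[i] + h * k3, u[i + 1])
        x[i + 1] = x[i] + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Backward sweep (RK4) for the adjoint; state interpolated at the midpoint.
    lam[N] = 0.0
    for i in range(N, 0, -1):
        xm = 0.5 * (x[i] + x[i - 1])
        k1 = adjoint_rhs(lam[i], x[i])
        k2 = adjoint_rhs(lam[i] - 0.5 * h * k1, xm)
        k3 = adjoint_rhs(lam[i] - 0.5 * h * k2, xm)
        k4 = adjoint_rhs(lam[i] - h * k3, x[i - 1])
        lam[i - 1] = lam[i] - h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Control update from the optimality condition, relaxed for stability.
    u = 0.5 * u_old + 0.5 * (-lam / 2.0)
    if np.max(np.abs(u - u_old)) < 1e-8:
        break

print("approximate objective:", h * np.sum(x**2 + u**2))
```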
Abstract:
A dual-scale model of the torrefaction of wood was developed and used to study industrial configurations. At the local scale, the computational code solves the coupled heat and mass transfer and the thermal degradation mechanisms of the wood components. At the global scale, the two-way coupling between the boards and the stack channels is treated as an integral component of the process. This model is used to investigate the effect of the stack configuration on the heat treatment of the boards. The simulations highlight that the exothermic reactions occurring in each single board can accumulate along the stack. This phenomenon may result in a dramatic heterogeneity of the process and poses a serious risk of thermal runaway, which is often observed in industrial plants. The model is used to explain how the risk of thermal runaway can be lowered by increasing the airflow velocity or the sticker thickness, or by reversing the gas flow.
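As a toy illustration of the accumulation effect (not the dual-scale model itself), a one-dimensional energy balance on the gas passing successive boards in a channel shows why a higher airflow rate dilutes the temperature rise; all numbers below are made up.

```python
# Toy channel energy balance: each board adds its exothermic heat release to the
# gas flowing over it, so the gas (and downstream boards) get progressively hotter.
def channel_gas_temperatures(t_inlet, n_boards, q_board, m_dot, cp):
    """Gas temperature (deg C) after each board for a given mass flow (kg/s)."""
    temps, t = [], t_inlet
    for _ in range(n_boards):
        t += q_board / (m_dot * cp)        # heat release q_board (W) per board
        temps.append(t)
    return temps

for m_dot in (0.05, 0.20):                 # a higher airflow dilutes the rise
    print(m_dot, [round(t, 1) for t in channel_gas_temperatures(
        t_inlet=220.0, n_boards=10, q_board=150.0, m_dot=m_dot, cp=1050.0)])
```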
Abstract:
A fractional differential equation is used to describe a fractal model of mobile/immobile transport with a power-law memory function. This equation is the limiting equation that governs continuous time random walks with heavy-tailed random waiting times. In this paper, we first propose a finite difference method to discretize the time variable and obtain a semi-discrete scheme, and we discuss its stability and convergence. Secondly, we consider a meshless method based on radial basis functions (RBFs) to discretize the space variable. In contrast to conventional FDM and FEM, the meshless method is demonstrated to have distinct advantages: calculations can be performed independently of a mesh, it is more accurate, and it can be used to solve complex problems. Finally, the convergence order is verified using a numerical example describing a fractal mobile/immobile transport process on different problem domains. The numerical results indicate that the present meshless approach is very effective for modeling and simulating fractional differential equations, and that it has good potential in the development of a robust simulation tool for problems in engineering and science that are governed by various types of fractional differential equations.
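For concreteness, the widely used L1 approximation of a Caputo time-fractional derivative of order 0 < alpha < 1 is sketched below; it is a typical choice for this class of time discretization, though not necessarily the exact scheme proposed in the paper.

```python
# L1 finite-difference approximation of a Caputo derivative of order 0 < alpha < 1
# (a standard time discretization for time-fractional transport equations).
import math

def caputo_l1(u_vals, tau, alpha):
    """Approximate D_t^alpha u at t_n = n*tau from samples u_vals = [u(0), ..., u(t_n)]."""
    n = len(u_vals) - 1
    coeff = tau ** (-alpha) / math.gamma(2.0 - alpha)
    total = 0.0
    for j in range(n):
        b_j = (j + 1) ** (1.0 - alpha) - j ** (1.0 - alpha)
        total += b_j * (u_vals[n - j] - u_vals[n - j - 1])
    return coeff * total

# Check against u(t) = t, whose Caputo derivative is t^(1-alpha) / Gamma(2-alpha).
alpha, tau, n = 0.7, 0.01, 100
u = [k * tau for k in range(n + 1)]
t_n = n * tau
print(caputo_l1(u, tau, alpha), t_n ** (1.0 - alpha) / math.gamma(2.0 - alpha))
```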
Abstract:
Achieving sustainable urban development has been identified as an ultimate goal of many contemporary planning endeavours and has become central to the formulation of urban planning policies. Within this concept, land-use and transport integration is highlighted as one of the most important and attainable policy objectives. In many cities, integration is embraced as an integral part of local development plans, and a number of key integration principles are identified. However, the lack of available evaluation methods to measure the extent of urban sustainability prevents successful implementation of these principles. This paper introduces a new indicator-based spatial composite indexing model developed to measure the sustainability performance of urban settings by taking into account land-use and transport integration principles. Model indicators are chosen via a thorough selection process in line with the key principles of land-use and transport integration. These indicators are grouped into categories and themes according to their topical relevance, and are then aggregated to form a spatial composite index that portrays an overview of the sustainability performance of the pilot study area used for model demonstration. The study results reveal that the model is a practical instrument for evaluating the success of local integration policies and visualizing the sustainability performance of built environments, and that it is useful both for identifying problematic areas and for formulating policy interventions.
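A minimal sketch of indicator-based composite indexing is given below; the zones, indicators, themes and weights are placeholders, not the model's actual indicator set.

```python
# Composite indexing sketch: min-max normalise each indicator across zones,
# average indicators within their theme, then combine themes with weights.
def minmax(values):
    lo, hi = min(values.values()), max(values.values())
    return {k: (v - lo) / (hi - lo) if hi > lo else 0.0 for k, v in values.items()}

def composite_index(zones, themes, theme_weights):
    """zones: {zone: {indicator: raw value}}; themes: {theme: [indicators]}."""
    normalised = {ind: minmax({z: zones[z][ind] for z in zones})
                  for inds in themes.values() for ind in inds}
    index = {}
    for z in zones:
        theme_scores = {t: sum(normalised[i][z] for i in inds) / len(inds)
                        for t, inds in themes.items()}
        index[z] = sum(theme_weights[t] * s for t, s in theme_scores.items())
    return index

zones = {"zone_A": {"dwelling_density": 35, "bus_stops_per_km2": 12},
         "zone_B": {"dwelling_density": 12, "bus_stops_per_km2": 4}}
themes = {"land_use": ["dwelling_density"], "transport": ["bus_stops_per_km2"]}
print(composite_index(zones, themes, {"land_use": 0.5, "transport": 0.5}))
```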