980 results for correlation modelling
Abstract:
With the growth of the Web, E-commerce activities are also becoming popular. Product recommendation is an effective way of marketing a product to potential customers. Based on a user's previous searches, most recommendation methods employ two-dimensional models to find relevant items, which are then recommended to the user. Furthermore, too many irrelevant recommendations worsen the information overload problem for a user. This happens because such vector- and matrix-based models are unable to find the latent relationships that exist between users and searches. Identifying user behaviour is a complex process, and usually involves comparing the searches a user has made. In most cases, traditional vector- and matrix-based methods are used to find the prominent features searched for by a user. In this research we employ tensors to find the relevant features searched for by users; these relevant features are then used for making recommendations. Evaluation on real datasets shows the effectiveness of such recommendations over vector- and matrix-based methods.
Abstract:
A better understanding of the behaviour of prepared cane and bagasse, and especially the ability to model the mechanical behaviour of bagasse as it is squeezed in a milling unit to extract juice, would help identify how to improve the current milling process; for example, to reduce final bagasse moisture. Previous investigations have established that juice flow through bagasse obeys Darcy's permeability law, that the grip of the rough surface of the grooves on the bagasse can be represented by the Mohr-Coulomb failure criterion for soils, and that the internal mechanical behaviour of bagasse can be represented by critical state behaviour similar to that of sand and clay. Current finite element models (FEM) available in commercial software have adequate permeability models; however, commercial software does not contain an adequate mechanical model for bagasse. Progress has been made in the last ten years towards implementing a mechanical model for bagasse in finite element software code. This paper builds on that progress and takes a further step towards obtaining an adequate material model. In particular, the prediction of volume change during shearing of normally consolidated final bagasse is addressed.
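For reference, the two constitutive relations named above take their standard textbook forms (conventional symbols, not the paper's notation):

$$ q = -\frac{k}{\mu}\,\nabla p \quad \text{(Darcy's law)}, \qquad \tau = c + \sigma_n \tan\phi \quad \text{(Mohr-Coulomb)}, $$

where $q$ is the juice flux driven by the pressure gradient $\nabla p$ through a medium of permeability $k$ with fluid viscosity $\mu$, and the shear strength $\tau$ depends on the cohesion $c$, normal stress $\sigma_n$ and friction angle $\phi$.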
Abstract:
This paper outlines a study to determine the correlation between the LA10(18hour) indicator and other road traffic noise indicators. It is based on a database comprising 404 measurement locations and 947 individual days of valid noise measurements, taken across numerous circumstances between November 2001 and November 2007. The paper first discusses the need for, and constraints on, these indicators, and the task of matching a suitable indicator to the various dynamic characteristics of road traffic noise. It then presents a statistical analysis of the road traffic noise monitoring data, correlating various indicators with the LA10(18hour) statistical indicator, and provides a comprehensive table of linear correlations. An extended analysis of relationships across the night-time period is included. The paper concludes with a discussion of the findings.
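As an illustration of the kind of linear correlation analysis described (using synthetic placeholder data, not the study's database):

```python
import numpy as np

# Illustrative sketch only: relating a day's LA10(18h) value to another
# indicator (here Leq(24h)) by simple linear regression, as in a table of
# linear correlations. The data below are synthetic placeholders.
rng = np.random.default_rng(0)
la10_18h = rng.uniform(60.0, 75.0, size=947)               # dB(A), one value per valid day
leq_24h = 0.9 * la10_18h + 3.0 + rng.normal(0, 1.0, 947)   # hypothetical companion indicator

r = np.corrcoef(la10_18h, leq_24h)[0, 1]             # Pearson correlation coefficient
slope, intercept = np.polyfit(la10_18h, leq_24h, 1)  # least-squares regression line

print(f"r = {r:.3f}; Leq(24h) ~ {slope:.2f} * LA10(18h) + {intercept:.2f}")
```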
Abstract:
Technologies and languages for integrated processes are a relatively recent innovation, and over that time many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to any middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-running and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model, which can lead to an inability to reason about long-running conversations at the process layer. Technologies and languages for process integration generally lack formality, which has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and these can lead to customer dissatisfaction and to integration projects failing to reach their potential. Standards for process integration share fundamental flaws similar to those of the languages and technologies, and they also compete directly with one another, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. In this way, process modelling can be conceptually complete without becoming stuck in a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware; they provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware; this framework provides a conceptual foundation upon which a process integration language could be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. It also proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on the ideas in this thesis, is briefly described at the end.
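The coloured Petri net semantics developed in the thesis is not reproduced in this abstract. As a minimal flavour of the formalism, the sketch below implements the token-firing rule of a plain (uncoloured) place/transition net modelling a one-slot message channel; all place and transition names are invented for illustration:

```python
# Minimal place/transition Petri net sketch of a one-slot message channel
# between two processes. Coloured Petri nets, as used in the thesis,
# additionally attach typed data to tokens; this plain net only counts them.
marking = {"sender_ready": 1, "channel": 0, "receiver_ready": 1, "received": 0}

# Each transition consumes tokens from its input places (first dict) and
# produces tokens in its output places (second dict).
transitions = {
    "send":    ({"sender_ready": 1},                 {"channel": 1}),
    "receive": ({"channel": 1, "receiver_ready": 1}, {"received": 1}),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(marking[p] >= n for p, n in inputs.items())

def fire(name):
    assert enabled(name), f"{name} is not enabled"
    inputs, outputs = transitions[name]
    for p, n in inputs.items():
        marking[p] -= n
    for p, n in outputs.items():
        marking[p] += n

fire("send")
fire("receive")
print(marking)  # {'sender_ready': 0, 'channel': 0, 'receiver_ready': 0, 'received': 1}
```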
Abstract:
Modelling power system load is a challenge, since the load level and composition vary with time. An accurate load model is important because there is a substantial component of load dynamics in the frequency range relevant to system stability. The composition of loads needs to be characterised because the time constants of composite loads affect the damping contributions of the loads to power system oscillations, and their effects vary with the time of day, depending on the mix of motor loads. This chapter has two main objectives: 1) to describe small-signal load modelling using on-line measurements; and 2) to present a new approach to developing models that reflect the load response to large disturbances. Small-signal load characterisation based on on-line measurements allows the composition of the load to be predicted with improved accuracy compared with post-mortem or classical load models. Rather than a generic dynamic model for small-signal modelling of the load, an explicit induction motor model is used, so that the performance for larger disturbances can be more reliably inferred. The relation between power and frequency/voltage can be explicitly formulated and the contribution of induction motors extracted. One of the main features of this work is that the induction motor component can be associated with nominal powers or equivalent motors.
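The chapter's explicit formulation is not reproduced in the abstract; for context, a widely used static form of the power-voltage/frequency relation, onto which a dynamic induction motor component is then superimposed, is

$$ P = P_0\left(\frac{V}{V_0}\right)^{\alpha}\bigl(1 + K_{pf}\,\Delta f\bigr), \qquad Q = Q_0\left(\frac{V}{V_0}\right)^{\beta}\bigl(1 + K_{qf}\,\Delta f\bigr), $$

where $P_0$ and $Q_0$ are the powers at nominal voltage $V_0$ and frequency, and the exponents $\alpha$, $\beta$ and frequency sensitivities $K_{pf}$, $K_{qf}$ are fitted from measurements.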
Abstract:
Handling information overload online is, from the user's point of view, a big challenge, especially when the number of websites is growing rapidly due to growth in e-commerce and other related activities. Personalization based on user needs is the key to solving the problem of information overload. Personalization methods help in identifying relevant information, which may be liked by a user. User profiles and object profiles are the important elements of a personalization system. When creating user and object profiles, most of the existing methods adopt two-dimensional similarity methods based on vector or matrix models in order to find inter-user and inter-object similarity. Moreover, for recommending similar objects to users, personalization systems use users-users, items-items and users-items similarity measures. In most cases, similarity measures such as Euclidean, Manhattan, cosine and many others based on vector or matrix methods are used to find the similarities. Web logs are high-dimensional datasets, consisting of multiple users and multiple searches, with many attributes for each. Two-dimensional data analysis methods may often overlook latent relationships that exist between users and items. In contrast to other studies, this thesis utilises tensors, which are high-dimensional data models, to build user and object profiles and to find the inter-relationships between users-users and users-items. To create an improved personalized Web system, this thesis proposes to build three types of profiles: individual user, group user and object profiles, utilising decomposition factors of tensor data models. A hybrid recommendation approach utilising group profiles (forming the basis of a collaborative filtering method) and object profiles (forming the basis of a content-based method) in conjunction with individual user profiles (forming the basis of a model-based approach) is proposed for making effective recommendations. A tensor-based clustering method is proposed that utilises the outcomes of popular tensor decomposition techniques such as PARAFAC, Tucker and HOSVD to group similar instances. An individual user profile, showing the user's highest interest, is represented by the top dimension values extracted from the component matrix obtained after tensor decomposition. A group profile, showing similar users and their highest interest, is built by clustering similar users based on tensor-decomposed values. A group profile is represented by the top association rules (containing various unique object combinations) derived from the searches made by the users of the cluster. An object profile is created to represent similar objects clustered on the basis of the similarity of their features. Depending on the category of a user (known, anonymous or frequent visitor to the website), any of the profiles or their combinations is used for making personalized recommendations. A ranking algorithm is also proposed that utilises the personalized information to order and rank the recommendations. The proposed methodology is evaluated on data collected from a real-life car website. Empirical analysis confirms the effectiveness of recommendations made by the proposed approach over other collaborative filtering and content-based recommendation approaches based on two-dimensional data analysis methods.
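As a rough sketch of the kind of pipeline described, and not the thesis's actual implementation, the snippet below decomposes a hypothetical three-way user x query x attribute tensor via HOSVD (one of the decompositions named above) and clusters users on the rows of the resulting user factor matrix; all shapes, ranks and names are invented:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical usage tensor: 50 users x 30 search queries x 10 item attributes.
X = rng.random((50, 30, 10))

def mode_factor(tensor, mode, rank):
    # Leading left singular vectors of the mode-n unfolding (an HOSVD factor).
    unfolding = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
    u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
    return u[:, :rank]

U_users = mode_factor(X, 0, rank=3)    # each row: latent profile of one user
U_queries = mode_factor(X, 1, rank=3)
U_attrs = mode_factor(X, 2, rank=3)

# Group profiles: cluster users on their decomposition factors.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(U_users)
print(labels[:10])
```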
Abstract:
Mixture models are a flexible tool for unsupervised clustering that has found popularity in a vast array of research areas. In studies of medicine, the use of mixtures holds the potential to greatly enhance our understanding of patient responses through the identification of clinically meaningful clusters that, given the complexity of many data sources, may otherwise be intangible. Furthermore, when developed in the Bayesian framework, mixture models provide a natural means for capturing and propagating uncertainty in different aspects of a clustering solution, arguably resulting in richer analyses of the population under study. This thesis aims to investigate the use of Bayesian mixture models in analysing varied and detailed sources of patient information collected in the study of complex disease. The first aim of this thesis is to showcase the flexibility of mixture models in modelling markedly different types of data. In particular, we examine three common variants of the mixture model, namely finite mixtures, Dirichlet Process mixtures and hidden Markov models. Beyond the development and application of these models to different sources of data, this thesis also focuses on modelling different aspects of uncertainty in clustering. Examples of clustering uncertainty considered are uncertainty in a patient's true cluster membership and uncertainty in the true number of clusters present. Finally, this thesis aims to address and propose solutions to the task of comparing clustering solutions, whether this be comparing patients or observations assigned to different subgroups or comparing clustering solutions over multiple datasets. To address these aims, we consider a case study in Parkinson's disease (PD), a complex and commonly diagnosed neurodegenerative disorder. In particular, two commonly collected sources of patient information are considered. The first source of data concerns symptoms associated with PD, recorded using the Unified Parkinson's Disease Rating Scale (UPDRS), and constitutes the first half of this thesis. The second half of this thesis is dedicated to the analysis of microelectrode recordings collected during Deep Brain Stimulation (DBS), a popular palliative treatment for advanced PD. Analysis of this second source of data centres on the problems of unsupervised detection and sorting of action potentials or "spikes" in recordings of multiple cell activity, providing valuable information on real-time neural activity in the brain.
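As a small illustration (not the thesis's models, which are developed in a fully Bayesian framework), a truncated Dirichlet Process mixture of the kind named above can be fitted with scikit-learn's variational implementation; the data here are synthetic:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(2)
# Synthetic "patient" data: two latent subgroups in 2 features.
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (80, 2))])

# Truncated Dirichlet Process mixture: surplus components receive weights
# near zero, so the effective number of clusters is inferred from the data.
dpm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

print(np.round(dpm.weights_, 3))   # most of the 10 weights collapse toward 0
print(dpm.predict_proba(X[:3]))    # per-patient cluster membership uncertainty
```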
Abstract:
This paper argues for a renewed focus on statistical reasoning in the beginning school years, with opportunities for children to engage in data modelling. Results are reported from the first year of a 3-year longitudinal study in which three classes of first-grade children (6-year-olds) and their teachers engaged in data modelling activities. The theme of Looking after our Environment, part of the children's science curriculum, provided the task context. The goals for the two activities addressed here included engaging children in core components of data modelling, namely, selecting attributes, structuring and representing data, identifying variation in data, and making predictions from given data. Results include the various ways in which children represented and re-represented collected data, including attribute selection, and the metarepresentational competence they displayed in doing so. The "data lenses" through which the children dealt with informal inference (variation and prediction) are also reported.
Abstract:
Three-dimensional wagon train models have been developed for crashworthiness analysis using a multi-body dynamics approach. The contribution of train size (number of wagons) to the frontal crash forces can be identified through the simulations. The effects of crash energy management (CEM) design and crash speed on train crashworthiness are examined. The CEM design can significantly improve train crashworthiness and the consequent vehicle stability performance, reducing derailment risks.
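The 3D multi-body models themselves are not given in the abstract; the sketch below is a deliberately simplified 1D analogue, with wagons as lumped masses joined by spring-damper couplers and every parameter value invented for illustration:

```python
import numpy as np

# Highly simplified 1D analogue of a wagon-train crash model: the lead wagon
# of a spring-damper-coupled chain strikes a rigid wall. All values invented.
n, m = 5, 40_000.0        # number of wagons, mass per wagon [kg]
k, c = 5.0e6, 2.0e5       # coupler stiffness [N/m] and damping [N s/m]
k_wall = 2.0e7            # contact stiffness of the rigid wall [N/m]
dt, steps = 1e-4, 20_000

x = np.arange(n, dtype=float) * 15.0  # wagon positions, 15 m apart; wall at x < 0
v = np.full(n, -10.0)                 # initial speed: 10 m/s towards the wall

peak_force = 0.0
for _ in range(steps):
    f = np.zeros(n)
    # Coupler forces between adjacent wagons.
    for i in range(n - 1):
        stretch = (x[i + 1] - x[i]) - 15.0
        fc = k * stretch + c * (v[i + 1] - v[i])
        f[i] += fc
        f[i + 1] -= fc
    # Wall contact on the lead wagon (one-sided spring).
    if x[0] < 0.0:
        f[0] += -k_wall * x[0]
        peak_force = max(peak_force, k_wall * abs(x[0]))
    v += (f / m) * dt   # semi-implicit Euler integration
    x += v * dt

print(f"peak frontal crash force ~ {peak_force / 1e6:.2f} MN")
```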
Abstract:
The growth of solid tumours beyond a critical size is dependent upon angiogenesis, the formation of new blood vessels from an existing vasculature. Tumours may remain dormant at microscopic sizes for some years before switching to a mode in which growth of a supportive vasculature is initiated. The new blood vessels supply nutrients, oxygen, and access to routes by which tumour cells may travel to other sites within the host (metastasize). In recent decades an abundance of biological research has focused on tumour-induced angiogenesis in the hope that treatments targeted at the vasculature may result in a stabilisation or regression of the disease: a tantalising prospect. The complex and fascinating process of angiogenesis has also attracted the interest of researchers in the field of mathematical biology, a discipline that is, for mathematics, relatively new. The challenge in mathematical biology is to produce a model that captures the essential elements and critical dependencies of a biological system; such a model may ultimately be used as a predictive tool. In this thesis we examine a number of aspects of tumour-induced angiogenesis, focusing on growth of the neovasculature external to the tumour. First, we present a one-dimensional continuum model of tumour-induced angiogenesis in which elements of the immune system or other tumour-cytotoxins are delivered via the newly formed vessels. This model, based on observations from experiments by Judah Folkman et al., is able to show regression of the tumour for some parameter regimes, and the modelling highlights a number of interesting aspects of the process that may be characterised further in the laboratory. The next model examines the initiation positions of blood vessel sprouts on an existing vessel, in a two-dimensional domain. This model hypothesises that a simple feedback inhibition mechanism may describe the spacing of these sprouts, with the inhibitor being produced by breakdown of the existing vessel's basement membrane. Finally, we have developed a stochastic model of blood vessel growth and anastomosis in three dimensions. The model has been implemented in C++, includes an OpenGL interface, and uses a novel algorithm for calculating the proximity of the line segments representing a growing vessel. This choice of programming language and graphics interface allows for near-simultaneous calculation and visualisation of blood vessel networks on a contemporary personal computer. In addition, the visualised results may be transformed interactively, and drop-down menus facilitate changes to the parameter values. Visualisation of results is of vital importance in the communication of mathematical information to a wide audience, and we aim to incorporate this philosophy in the thesis. As biological research further uncovers the intriguing processes involved in tumour-induced angiogenesis, we conclude with a comment from the mathematical biologist Jim Murray: mathematical biology is "... the most exciting modern application of mathematics".
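The thesis's novel proximity algorithm is not described in the abstract; a standard way to compute the minimum distance between two line segments representing growing vessels (clamping the closest-point parameters to both segments, following Ericson's well-known formulation) is sketched below:

```python
import numpy as np

def closest_distance_between_segments(p1, q1, p2, q2, eps=1e-12):
    # Minimum distance between segments [p1, q1] and [p2, q2] in 3D.
    d1 = q1 - p1          # direction of segment 1
    d2 = q2 - p2          # direction of segment 2
    r = p1 - p2
    a = d1 @ d1           # squared length of segment 1
    e = d2 @ d2           # squared length of segment 2
    f = d2 @ r
    if a <= eps and e <= eps:      # both segments degenerate to points
        return np.linalg.norm(p1 - p2)
    if a <= eps:                   # first segment degenerates to a point
        s, t = 0.0, np.clip(f / e, 0.0, 1.0)
    else:
        c = d1 @ r
        if e <= eps:               # second segment degenerates to a point
            s, t = np.clip(-c / a, 0.0, 1.0), 0.0
        else:
            b = d1 @ d2
            denom = a * e - b * b  # always >= 0; 0 when segments are parallel
            s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > eps else 0.0
            t = (b * s + f) / e
            if t < 0.0:            # re-clamp s if t left [0, 1]
                t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
            elif t > 1.0:
                t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return np.linalg.norm((p1 + s * d1) - (p2 + t * d2))

# Two parallel unit segments offset by (0, 1, 1): distance is sqrt(2).
p1, q1 = np.array([0., 0., 0.]), np.array([1., 0., 0.])
p2, q2 = np.array([0., 1., 1.]), np.array([1., 1., 1.])
print(closest_distance_between_segments(p1, q1, p2, q2))  # ~1.414
```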
Abstract:
LiteSteel beam (LSB) is a new cold-formed steel hollow flange channel beam produced using a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. It has the beneficial characteristics of torsionally rigid closed rectangular flanges combined with economical fabrication from a single strip of high strength steel. Although LSB sections are commonly used as flexural members, no research had been undertaken on the shear behaviour of LSBs. Therefore, experimental and numerical studies were undertaken to investigate the shear behaviour and strength of LSBs. In this research, finite element models of LSBs were developed to investigate their nonlinear shear behaviour, including their buckling characteristics and ultimate shear strength, and were validated by comparing their results with available experimental results. The models provided full details of the shear buckling and strength characteristics of LSBs, and showed considerable improvements to web shear buckling in LSBs and associated post-buckling strength. This paper presents the details of the finite element models of LSBs and the results. Both the finite element analyses and the experiments showed that the current design rules in cold-formed steel codes are very conservative for the shear design of LSBs. The ultimate shear capacities from the finite element analyses confirmed the accuracy of proposed shear strength equations for LSBs based on the North American specification and DSM design equations. The developed finite element models were also used to investigate the reduction in shear capacity of LSBs when full-height web side plates were not used or when only one web side plate was used, and these results are also presented in this paper.
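The proposed shear strength equations themselves are not given in the abstract; for context, the direct strength method (DSM) shear provisions of the North American specification, without tension field action, take the form

$$ V_v = \begin{cases} V_y, & \lambda_v \le 0.815,\\ 0.815\sqrt{V_{cr}V_y}, & 0.815 < \lambda_v \le 1.227,\\ V_{cr}, & \lambda_v > 1.227, \end{cases} \qquad \lambda_v = \sqrt{\frac{V_y}{V_{cr}}}, $$

where $V_y = 0.6 A_w f_y$ is the shear yield capacity of the web and $V_{cr}$ its elastic shear buckling capacity; proposals of the kind the paper describes adjust such expressions to capture the improved post-buckling behaviour of LSBs.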
Abstract:
Recently, an innovative composite panel system was developed in which a thin insulation layer was used externally between two plasterboards to improve the fire performance of light gauge cold-formed steel frame walls. In this research, finite-element thermal models of both the traditional light gauge cold-formed steel frame wall panels with cavity insulation and the new light gauge cold-formed steel frame composite wall panels were developed to simulate their thermal behaviour under standard and realistic fire conditions. Suitable apparent thermal properties of gypsum plasterboard, insulation materials and steel were proposed and used. The developed models were then validated by comparing their results with available fire test results. This article presents the details of the developed finite-element models of small-scale non-load-bearing light gauge cold-formed steel frame wall panels and the results of the thermal analysis. It has been shown that accurate finite-element models can be used to simulate the thermal behaviour of small-scale light gauge cold-formed steel frame walls with varying configurations of insulation and plasterboards. The numerical results show that the use of cavity insulation was detrimental to the fire rating of light gauge cold-formed steel frame walls, while the use of external insulation offered them superior thermal protection. The effects of real fire conditions are also presented.
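The finite-element models are not reproduced here; as a much-simplified illustration of the underlying physics, the sketch below advances a 1D explicit finite-difference model of heat conduction through a plasterboard-insulation-plasterboard panel under the ISO 834 standard fire curve, with every material value an invented placeholder:

```python
import numpy as np

# Simplified 1D explicit finite-difference analogue of heat transfer through a
# plasterboard-insulation-plasterboard panel exposed to fire on one side. Real
# models use finite elements with temperature-dependent "apparent" properties.
nx, L = 60, 0.045                  # nodes, total panel thickness [m]
dx = L / (nx - 1)
x = np.linspace(0.0, L, nx)

# Placeholder thermal diffusivities [m^2/s]: insulation core vs plasterboard.
alpha = np.where((x > 0.015) & (x < 0.030), 8e-7, 3e-7)
dt = 0.4 * dx**2 / alpha.max()     # within the explicit stability limit
T = np.full(nx, 20.0)              # initial ambient temperature [deg C]

t = 0.0
while t < 1800.0:                  # simulate 30 minutes of fire exposure
    T[0] = 20.0 + 345.0 * np.log10(8.0 * t / 60.0 + 1.0)  # ISO 834 fire curve
    T[1:-1] += dt * alpha[1:-1] * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T[-1] = T[-2]                  # crude insulated boundary on the ambient side
    t += dt

print(f"unexposed-face temperature after 30 min: {T[-1]:.0f} deg C")
```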
Abstract:
Seat pressure is known to be a major factor in vehicle seat comfort, yet in passenger vehicles there is a lack of research into the seat comfort of rear seat occupants. As accurate seat pressure measurement requires significant effort, simulation of seat pressure is evolving as a preferred method; however, analytic methods are based on complex finite element modeling and are therefore time consuming and involve high investment. Based on accurate anthropometric measurements of 64 male subjects and outboard rear seat pressure measurements in three different passenger vehicles, this study investigates whether a set of parameters derived from seat pressure mapping is sensitive enough to differentiate between different seats and whether they correlate with anthropometry in linear models. In addition to the pressure map analysis, H-Points were measured with a coordinate measurement system based on palpated body landmarks, and the range of H-Point locations in the three seats is provided. It was found that, for the cushion, cushion contact area and cushion front area/force could be modeled from subject anthropometry, while only seatback contact area could be modeled from anthropometry for all three vehicles. Major differences were found between the vehicles for the other parameters.
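As a sketch of the kind of linear modelling described (with entirely synthetic data and hypothetical predictors, not the study's measurements):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 64  # subjects, as in the study
# Hypothetical anthropometric predictors: stature [mm], weight [kg], hip breadth [mm].
A = np.column_stack([
    rng.normal(1750, 70, n),
    rng.normal(80, 12, n),
    rng.normal(360, 25, n),
])
# Synthetic response standing in for cushion contact area [cm^2].
area = 0.2 * A[:, 0] + 6.0 * A[:, 1] + 0.5 * A[:, 2] + rng.normal(0, 30, n)

model = LinearRegression().fit(A, area)
print("coefficients:", np.round(model.coef_, 2))
print("R^2:", round(model.score(A, area), 3))
```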
Abstract:
The complex interaction of the bones of the foot has been explored in detail in recent years, leading to the acknowledgement in the biomechanics community that the foot can no longer be considered a single rigid segment. With the advance of motion analysis technology it has become possible to quantify the biomechanics of the simplified units, or segments, that make up the foot. Advances in technology, coupled with falling hardware prices, have resulted in the uptake of more advanced tools for clinical gait analysis. The increased use of these techniques in clinical practice requires defined standards for the modelling and reporting of foot and ankle kinematics. This systematic review aims to provide a critical appraisal of commonly used foot and ankle marker sets designed to assess kinematics, and thus to provide a theoretical background for the development of modelling standards.
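Multi-segment foot models of the kind reviewed typically construct an orthonormal coordinate frame for each segment from three or more markers and report joint kinematics as the relative rotation between adjacent frames. A generic sketch, with invented marker positions, is:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def segment_frame(origin, anterior, lateral):
    # Orthonormal frame from three markers: x towards the anterior marker,
    # z perpendicular to the marker plane, y completing the right-handed set.
    x = anterior - origin
    x /= np.linalg.norm(x)
    z = np.cross(x, lateral - origin)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])  # 3x3 rotation matrix (axes as columns)

# Invented marker positions for a "shank" and a "hindfoot" segment [m].
shank = segment_frame(np.array([0., 0., 0.40]),
                      np.array([0.05, 0., 0.40]),
                      np.array([0., 0.05, 0.40]))
hindfoot = segment_frame(np.array([0., 0., 0.05]),
                         np.array([0.08, 0.01, 0.04]),
                         np.array([0.01, 0.05, 0.05]))

# Relative orientation of the hindfoot w.r.t. the shank, as Euler angles.
rel = Rotation.from_matrix(shank.T @ hindfoot)
print(np.round(rel.as_euler("xyz", degrees=True), 1))
```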