891 results for Economic models.
Abstract:
- describe what is meant by socioeconomic differences in health, and the social and emotional determinants of health
- understand how health inequalities are affected by the social and economic circumstances that people experience throughout their lives
- discuss how factors such as living and working conditions, income, place and education can impact on health
- identify actions for public health policy-makers that have the potential to make a difference in improving health outcomes within populations
- appreciate the concept of social cohesion and social capital, and their role as potential protective factors in health
- understand conceptual models that can assist in analysing these issues.
Abstract:
Evaluating the safety of different traffic facilities is a complex and crucial task. Microscopic simulation models have been widely used for traffic management but have been largely neglected in traffic safety studies. Using micro-simulation to study safety is more ethical and accessible than traditional safety studies, which only assess historical crash data. However, current microscopic models are unable to mimic unsafe driver behavior, as they are based on presumptions of safe driver behavior. This highlights the need for a critical examination of current microscopic models to determine which components and parameters affect the reproduction of safety indicators. The question then arises whether these safety indicators are valid indicators of traffic safety. Selected safety indicators were therefore tested for straight motorway segments in Brisbane, Australia. This test examined the capability of a micro-simulation model and provides a better understanding of micro-simulation models and of how such models, in particular car-following models, can be enriched to produce more accurate safety indicators.
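The abstract does not name the specific safety indicators tested, but a common surrogate indicator in micro-simulation safety studies is time-to-collision (TTC). The sketch below is illustrative only: the trajectory data, the 1.5 s threshold, and the function names are all hypothetical, not taken from the study.

```python
def time_to_collision(gap_m, v_follower_ms, v_leader_ms):
    """Time-to-collision (s): gap divided by closing speed, or None if not closing."""
    closing = v_follower_ms - v_leader_ms
    if closing <= 0:
        return None  # follower is not gaining on the leader; no collision course
    return gap_m / closing

def unsafe_events(trajectory, ttc_threshold_s=1.5):
    """Flag car-following records whose TTC falls below a (hypothetical) threshold."""
    flagged = []
    for t, gap, vf, vl in trajectory:
        ttc = time_to_collision(gap, vf, vl)
        if ttc is not None and ttc < ttc_threshold_s:
            flagged.append((t, ttc))
    return flagged

# Hypothetical records: (time s, gap m, follower speed m/s, leader speed m/s).
# At t=1.0 the gap has shrunk to 6 m at a 5 m/s closing speed -> TTC = 1.2 s.
events = unsafe_events([(0.0, 40.0, 30.0, 25.0), (1.0, 6.0, 30.0, 25.0)])
```

A car-following model that never produces short TTCs by construction would, as the abstract argues, understate risk regardless of how it is calibrated.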
Abstract:
Highway construction works have significant bearings on all aspects of sustainability. As they typically involve huge capital funds, stakeholders tend to focus on the financial justifications of a project, especially when embedding sustainability principles and practices may demand significant initial investment. Increasing public awareness and government policies demand that infrastructure projects respond to environmental challenges, and people are starting to realise the negative consequences of not pursuing sustainability. Stakeholders are now keen to identify sustainable alternatives and the financial implications of including them on a whole-lifecycle basis. Therefore, tools that aid the evaluation of investment options, such as the provision of environmentally sustainable features in roads and highways, are highly desirable. Life-cycle cost analysis (LCCA) is generally recognised as a valuable approach to investment decision making for construction works. However, to date it has had limited application because current LCCA models tend to focus on economic issues alone and are not able to deal with sustainability factors. This paper reports research on identifying sustainability-related factors in highway construction projects, in the quantitative and qualitative forms of a multi-criteria analysis. These factors are then incorporated into existing LCCA models to produce a new sustainability-based LCCA model with cost elements specific to sustainability measures. This gives highway project stakeholders a practical tool to evaluate investment decisions and reach an optimum balance between financial viability and sustainability deliverables.
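The paper's own LCCA model is not reproduced here, but the core LCCA mechanic it extends is a discounted sum of costs over the asset's life. A minimal sketch, with entirely hypothetical costs and discount rate, showing how an upfront sustainability investment can still win on a whole-lifecycle basis:

```python
def life_cycle_cost(initial_cost, annual_costs, discount_rate):
    """Present value of a cost stream: initial outlay plus discounted annual costs.

    annual_costs[t] is the cost incurred at the end of year t+1.
    """
    pv = initial_cost
    for t, cost in enumerate(annual_costs, start=1):
        pv += cost / (1 + discount_rate) ** t
    return pv

# Hypothetical comparison: a conventional design vs. one with a higher upfront
# sustainability investment that lowers recurring maintenance costs.
conventional = life_cycle_cost(1000.0, [100.0] * 5, 0.05)
sustainable  = life_cycle_cost(1150.0, [50.0] * 5, 0.05)
```

With these illustrative numbers the sustainable option has the lower life-cycle cost despite its larger initial outlay, which is the kind of trade-off the proposed model is meant to surface.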
Abstract:
Non-invasive vibration analysis has been used extensively to monitor the progression of dental implant healing and stabilization. It is now being considered as a method to monitor femoral implants in transfemoral amputees. This paper evaluates two modal analysis excitation methods and investigates their capabilities in detecting changes at the interface between the implant and the bone that occur during osseointegration. Excitation of bone-implant physical models with an electromagnetic shaker provided higher coherence values and a greater number of modes over the same frequency range when compared to the impact hammer. Differences were detected in the natural frequencies and fundamental mode shape of the model when the fit of the implant in the bone was altered. The ability to detect changes in the model's dynamic properties demonstrates the potential of modal analysis in this application and warrants further investigation.
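The physical intuition behind the natural-frequency shift can be sketched with an idealized single-degree-of-freedom mass-spring model (the paper's bone-implant models are far richer). The stiffness and mass values below are hypothetical, chosen only to show that frequency scales with the square root of interface stiffness:

```python
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """Natural frequency of an idealized single-DOF mass-spring system:
    f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Hypothetical numbers: osseointegration stiffens the bone-implant interface,
# raising the natural frequency that vibration monitoring would detect.
loose_fit = natural_frequency_hz(1.0e6, 0.5)
osseointegrated = natural_frequency_hz(4.0e6, 0.5)  # 4x stiffness -> 2x frequency
```

This square-root scaling is why even modest interface changes produce measurable frequency shifts.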
Abstract:
With the increasing number of XML documents in varied domains, it has become essential to identify ways of finding interesting information in these documents. Data mining techniques can be used to derive such information. Mining of XML documents is affected by the underlying document model, owing to the semi-structured nature of these documents. Hence, in this chapter we present an overview of the various models of XML documents, how these models have been used for mining, and some of the issues and challenges they raise. In addition, this chapter provides some insights into future models of XML documents for effectively capturing their two most important features, namely structure and content, for mining.
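The two features the chapter highlights, structure and content, can be made concrete with a minimal sketch using Python's standard `xml.etree.ElementTree`: each element is reduced to its root-to-node tag path (structure) paired with its text (content). The sample document is hypothetical.

```python
import xml.etree.ElementTree as ET

def paths_and_text(xml_string):
    """Extract (tag-path, text) pairs: the structural and content
    features that XML mining models need to capture together."""
    root = ET.fromstring(xml_string)
    features = []

    def walk(elem, path):
        current = path + "/" + elem.tag
        text = (elem.text or "").strip()
        features.append((current, text))
        for child in elem:
            walk(child, current)

    walk(root, "")
    return features

doc = "<book><title>XML Mining</title><year>2010</year></book>"
feats = paths_and_text(doc)
```

Flat models that keep only the text lose the paths, and purely structural models lose the text; representations like this one retain both, which is the tension the chapter discusses.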
Abstract:
Existing recommendation systems often recommend products to users by capturing item-to-item and user-to-user similarity measures. These types of recommendation systems become inefficient in people-to-people networks, where people-to-people recommendation requires a two-way relationship. Moreover, existing recommendation methods use traditional two-dimensional models to find interrelationships between alike users and items. Modelling a people-to-people network with two-dimensional models is not efficient enough, as the latent correlations between people and their attributes are not utilized. In this paper, we propose a novel tensor decomposition-based method for people-to-people recommendation based on users' profiles and their interactions. People-to-people network data is multi-dimensional; when modeled using vector-based methods it tends to suffer information loss, as such methods capture either the interactions or the attributes of the users but not both. This paper utilizes tensor models, which have the ability to correlate and find latent relationships between similar users based on both kinds of information, user interactions and user attributes, in order to generate recommendations. Empirical analysis is conducted on a real-life online dating dataset. As the results demonstrate, the use of tensor modeling and decomposition has enabled the identification of latent correlations between people based on their attributes and interactions in the network, and quality recommendations have been derived using the 'alike' users concept.
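A tiny sketch of the data representation involved: a 3-way tensor indexed by (user, user, channel) holds both interactions and attributes at once, and a mode-1 unfolding turns each user's slice into one vector so that 'alike' users can be compared. Note this illustrates only the unfolding step; the paper's actual method applies a tensor decomposition (factorization) rather than the direct cosine comparison shown here, and the three-user tensor is hypothetical.

```python
import math

def mode1_unfold(tensor):
    """Flatten each user's frontal slice into one long row vector."""
    return [[v for row in slice_i for v in row] for slice_i in tensor]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical tensor[i][j][k]: channel k (interaction, attribute match)
# from user i toward user j, for three users.
tensor = [
    [[0, 0], [1, 1], [0, 1]],   # user 0
    [[0, 0], [1, 1], [0, 1]],   # user 1: behaves exactly like user 0
    [[1, 0], [0, 0], [1, 0]],   # user 2: behaves differently
]
rows = mode1_unfold(tensor)
sim_01 = cosine(rows[0], rows[1])   # 'alike' users
sim_02 = cosine(rows[0], rows[2])
```

Because the vectors interleave both channels, users 0 and 1 come out maximally similar on interactions *and* attributes together, which a two-dimensional (matrix) model keeping only one channel could not express.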
Abstract:
Continuum, partial differential equation models are often used to describe the collective motion of cell populations, with various types of motility represented by the choice of diffusion coefficient, and cell proliferation captured by the source terms. Previously, the choice of diffusion coefficient has been largely arbitrary, with the decision to choose a particular linear or nonlinear form generally based on calibration arguments rather than making any physical connection with the underlying individual-level properties of the cell motility mechanism. In this work we provide a new link between individual-level models, which account for important cell properties such as varying cell shape and volume exclusion, and population-level partial differential equation models. We work in an exclusion process framework, considering aligned, elongated cells that may occupy more than one lattice site, in order to represent populations of agents with different sizes. Three different idealizations of the individual-level mechanism are proposed, and these are connected to three different partial differential equations, each with a different diffusion coefficient; one linear, one nonlinear and degenerate, and one nonlinear and nondegenerate. We test the ability of these three models to predict the population-level response of a cell spreading problem for both proliferative and nonproliferative cases. We also explore the potential of our models to predict long-time travelling wave invasion rates and extend our results to two-dimensional spreading and invasion. Our results show that each model can accurately predict density data for nonproliferative systems, but that only one does so for proliferative systems. Hence great care must be taken when predicting density data for populations with varying cell shape.
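The exclusion-process idea at the individual level can be sketched in a few lines: agents on a lattice attempt random moves, and moves into occupied sites are aborted. This is a deliberately simplified version, with single-site agents rather than the paper's elongated, multi-site cells, and purely illustrative parameters.

```python
import random

def step(lattice, rng):
    """One sweep of a simple 1D exclusion process: each occupied site attempts
    a move to a random neighbour; moves into occupied or out-of-bounds sites
    are aborted (volume exclusion)."""
    n = len(lattice)
    occupied = [i for i, s in enumerate(lattice) if s == 1]
    rng.shuffle(occupied)  # random update order within the sweep
    for i in occupied:
        target = i + rng.choice((-1, 1))
        if 0 <= target < n and lattice[target] == 0:
            lattice[target], lattice[i] = 1, 0
    return lattice

rng = random.Random(42)            # seeded for reproducibility
lattice = [1] * 10 + [0] * 30      # agents initially packed on the left
for _ in range(100):
    step(lattice, rng)             # population spreads rightward over time
```

Averaging many such realizations produces the density profiles that the continuum PDEs, with their derived diffusion coefficients, are asked to predict.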
Abstract:
To examine socioeconomic differences in the frequency and types of takeaway foods consumed. Cross-sectional postal survey. Participants were asked about their usual consumption of overall takeaway food (< four times a month, or ≥ four times a month) and 22 specific takeaway food items (< once a month, or ≥ once a month): these latter foods were grouped into “healthy” and “less healthy” choices. Socioeconomic position was measured using education and equivalised household income, and differences in takeaway food consumption were assessed by calculating prevalence ratios using log binomial regression. Adults aged 25–64 years from Brisbane, Australia were randomly selected from the electoral roll (N = 903, 63.7% response rate). Compared with their more educated counterparts, the least educated were more regular consumers of overall takeaway food and fruit/vegetable juice, and less regular consumers of sushi. For the “less healthy” items, the least educated more regularly consumed potato chips, savoury pies, fried chicken, and non-diet soft drinks; however, the least educated were less likely to consume curry. Household income was not associated with overall takeaway consumption. The lowest income group were more regular consumers of fruit/vegetable juice compared with the highest income group. Among the “less healthy” items, the lowest income group were more regular consumers of fried fish, ice-cream, and milk shakes, while curry was consumed less regularly. The frequency and types of takeaway foods consumed by socioeconomically disadvantaged groups may contribute to inequalities in overweight/obesity and chronic disease.
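The abstract's effect measure, the prevalence ratio, reduces in the unadjusted two-group case to a simple ratio of group prevalences (the study itself uses log binomial regression, which additionally allows covariate adjustment). A sketch with entirely hypothetical counts:

```python
def prevalence_ratio(cases_exposed, n_exposed, cases_reference, n_reference):
    """Unadjusted prevalence ratio: prevalence of the outcome in the exposed
    group divided by prevalence in the reference group."""
    p_exposed = cases_exposed / n_exposed
    p_reference = cases_reference / n_reference
    return p_exposed / p_reference

# Hypothetical counts (not from the study): 60/150 least-educated vs.
# 40/200 most-educated respondents eating takeaway >= 4 times a month.
pr = prevalence_ratio(60, 150, 40, 200)  # 0.40 / 0.20
```

A prevalence ratio of 2.0 would read as: the outcome is twice as prevalent in the least-educated group as in the reference group.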
Abstract:
The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view. In this latter case, models facilitate stakeholder communication and software system design. Research has investigated several proposals regarding measures for business process models, from a rather correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds, which can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
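One standard way to read a threshold off a ROC curve, likely simpler than the paper's adaptation, is to pick the cut-off maximizing Youden's J = TPR − FPR. A sketch with hypothetical (measure value, contains-error) data:

```python
def best_threshold(scores_with_errors):
    """Pick the measure threshold maximizing Youden's J = TPR - FPR,
    treating 'measure >= threshold' as a prediction of 'model has errors'."""
    positives = [s for s, has_error in scores_with_errors if has_error]
    negatives = [s for s, has_error in scores_with_errors if not has_error]
    best_j, best_t = -1.0, None
    for t in sorted({s for s, _ in scores_with_errors}):
        tpr = sum(s >= t for s in positives) / len(positives)
        fpr = sum(s >= t for s in negatives) / len(negatives)
        if tpr - fpr > best_j:
            best_j, best_t = tpr - fpr, t
    return best_t, best_j

# Hypothetical data: (structural measure, e.g. model size; model contains error).
data = [(10, False), (20, False), (30, False), (40, True), (50, True), (60, True)]
threshold, j = best_threshold(data)
```

A threshold found this way is exactly the kind of actionable cut-off the abstract argues correlational measures alone cannot provide: "above this size, expect errors and take counter-action."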
Abstract:
Many modern business environments employ software to automate the delivery of workflows, whereas workflow design and generation remains a laborious technical task for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge, or workflow model compositions from knowledge bases. Many approaches draw on principles from automatic planning, but are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we prove that an associative composition operator permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to prove tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
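The associativity property at the heart of the composition operator can be illustrated with a toy model: tasks as sequences of steps, composition as concatenation. This is only an analogy for the property, not the paper's actual operator or deduction rules, and the task names are hypothetical.

```python
def compose(task_a, task_b):
    """Sequential composition of two tasks, each a tuple of atomic steps."""
    return tuple(task_a) + tuple(task_b)

# Hypothetical atomic tasks in a mechanistic work environment.
cut = ("cut",)
drill = ("drill",)
polish = ("polish",)

# Associativity: grouping does not matter, so sub-workflows can be composed
# hierarchically in any order of assembly and yield the same overall task.
left = compose(compose(cut, drill), polish)
right = compose(cut, compose(drill, polish))
```

Associativity is what makes hierarchical composition "crisp": a parent task can be refined into sub-tasks, and those into sub-sub-tasks, without the bracketing of refinements changing the resulting workflow.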
Abstract:
Nowadays, business process management is an important approach for managing organizations from an operational perspective. As a consequence, it is common to see organizations develop collections of hundreds or even thousands of business process models. Such large collections of process models bring new challenges and provide new opportunities, as the knowledge they encapsulate needs to be properly managed. Therefore, a variety of techniques for managing large collections of business process models is being developed. The goal of this paper is to provide an overview of the management techniques that currently exist, as well as the open research challenges that they pose.
Abstract:
The development of highway infrastructure typically requires major capital input over a long period. This often causes serious financial constraints for investors. The push for sustainability has added new dimensions to the complexity of evaluating highway projects, particularly on the cost front. This makes the determination of long-term viability an even more precarious exercise. Life-cycle costing analysis (LCCA) is generally recognised as a valuable tool for the assessment of financial decisions on construction works. However, to date, existing LCCA models are deficient in dealing with sustainability factors, particularly for infrastructure projects, due to their inherent focus on economic issues alone. This research probed the major challenges of implementing sustainability in highway infrastructure development in terms of financial concerns and obligations. Drawing on a literature review, a questionnaire survey of industry stakeholders, and semi-structured interviews with senior practitioners involved in highway infrastructure development, the research identified the relative importance of cost components relating to sustainability measures and, on that basis, developed ways of improving existing LCCA models to incorporate sustainability commitments into long-term financial management. On this platform, a decision support model incorporating the Fuzzy Analytical Hierarchy Process and LCCA was developed for the evaluation of the specific cost components of most concern to infrastructure stakeholders. Two real highway infrastructure projects in Australia were then used for testing, application and validation, before the decision support model was finalised. Improved industry understanding and tools such as the developed model will lead to positive sustainability deliverables while ensuring financial viability over the lifecycle of highway infrastructure projects.
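A full Fuzzy Analytical Hierarchy Process derives criteria weights from fuzzy pairwise comparisons; the sketch below skips that step and shows only the final aggregation, a crisp weighted multi-criteria score. All criteria names, weights, and scores are hypothetical, not from the research.

```python
def weighted_score(scores, weights):
    """Crisp multi-criteria score: weighted sum of per-criterion scores in [0, 1]."""
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

# Hypothetical criteria weights (a real FAHP would derive these from
# stakeholders' fuzzy pairwise comparisons, not assert them directly).
weights = {"life_cycle_cost": 0.5, "emissions": 0.3, "community_impact": 0.2}

option_a = weighted_score(
    {"life_cycle_cost": 0.9, "emissions": 0.4, "community_impact": 0.5}, weights)
option_b = weighted_score(
    {"life_cycle_cost": 0.7, "emissions": 0.8, "community_impact": 0.9}, weights)
```

Combining an LCCA-derived cost score with sustainability criteria in one weighted score is the basic shape of the decision support the abstract describes: an option weaker on pure cost can still rank higher overall.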
Abstract:
There is an intimate interconnectivity between policy guidelines defining reform and the delineation of what research methods would subsequently be applied to determine reform success. Research is guided as much by the metaphors describing it as by the ensuing empirical definition of the actions and results obtained from it. In a call for different reform policy metaphors, Lumby and English (2010) note, “The primary responsibility for the parlous state of education... lies with the policy makers that have racked our schools with reductive and dehumanizing processes, following the metaphors of market efficiency, and leadership models based on accounting and the characteristics of machine bureaucracy” (p. 127).
Abstract:
This paper presents an approach to building an observation likelihood function from a set of sparse, noisy training observations taken from known locations by a sensor with no obvious geometric model. The basic approach is to fit an interpolant to the training data, representing the expected observation, and to assume additive sensor noise. The paper takes a Bayesian view of the problem, maintaining a posterior over interpolants rather than simply the maximum-likelihood interpolant, giving a measure of uncertainty in the map at any point. This is done using a Gaussian process framework. To validate the approach experimentally, a model of an environment is built using observations from an omni-directional camera. After a model has been built from the training data, a particle filter is used to localise while traversing this environment.
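The key Gaussian process property the paper relies on, a predicted observation *plus* an uncertainty at every query point, can be sketched in one dimension with just two training observations, so the kernel matrix can be inverted analytically. This toy omits the paper's omni-directional camera observations and particle filter entirely; all values are hypothetical.

```python
import math

def rbf(x1, x2, length_scale=1.0):
    """Squared-exponential (RBF) covariance between two 1D inputs."""
    return math.exp(-((x1 - x2) ** 2) / (2 * length_scale ** 2))

def gp_posterior(x_train, y_train, x_query, noise_var=0.01):
    """GP posterior mean and variance at one query point, for exactly two
    training observations (the 2x2 matrix K + noise*I is inverted by hand)."""
    (xa, xb), (ya, yb) = x_train, y_train
    a = rbf(xa, xa) + noise_var
    b = rbf(xa, xb)
    d = rbf(xb, xb) + noise_var
    det = a * d - b * b
    # alpha = (K + noise*I)^-1 y
    alpha_a = (d * ya - b * yb) / det
    alpha_b = (-b * ya + a * yb) / det
    ka, kb = rbf(x_query, xa), rbf(x_query, xb)
    mean = ka * alpha_a + kb * alpha_b
    # variance = k(x,x) - k_*^T (K + noise*I)^-1 k_*
    v_a = (d * ka - b * kb) / det
    v_b = (-b * ka + a * kb) / det
    var = rbf(x_query, x_query) - (ka * v_a + kb * v_b)
    return mean, var

# Near a training point the map is confident; far from the data it is not.
mean_near, var_near = gp_posterior((0.0, 2.0), (1.0, 3.0), 0.0)
mean_far, var_far = gp_posterior((0.0, 2.0), (1.0, 3.0), 10.0)
```

That growing variance away from the training data is exactly the "measure of uncertainty in the map at any point" that a maximum-likelihood interpolant alone would not provide.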
Abstract:
This study estimates the trend of suicide rate changes over the past three decades in China and attempts to identify its social and economic correlates. Official data on suicide rates and economic indexes during 1982–2005 from Shandong Province of China were analyzed. The suicide data were categorized by rural/urban location and gender, and the economic indexes include GDP, GDP per capita, rural income, and urban income, all adjusted for inflation. We found a significant increase in economic development and a decrease in suicide rates over the decades under study. The decrease in suicide rates is correlated with the tremendous growth of the economy. The unusual decrease in Chinese suicide rates over the past decades is accounted for within the Chinese cultural context and possibly by the Strain Theory of Suicide.
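The correlation the abstract reports can be illustrated with a Pearson coefficient on two series; the numbers below are a hypothetical illustration of the pattern (rising GDP per capita, falling suicide rate), not the Shandong data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical series: GDP per capita rising while the suicide rate falls
# gives a strong negative correlation, the pattern the study reports.
gdp = [1.0, 2.0, 3.0, 4.0, 5.0]
suicide_rate = [25.0, 22.0, 18.0, 15.0, 12.0]
r = pearson_r(gdp, suicide_rate)
```

A strongly negative r summarizes the association; as the abstract notes, interpreting it causally requires the cultural context and theory, since correlation alone cannot establish why the rates fell.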