993 results for Modelling goal


Relevance:

30.00%

Publisher:

Abstract:

The Caribbean region remains highly vulnerable to the impacts of climate change. To assess the social and economic consequences of climate change for the region, the Economic Commission for Latin America and the Caribbean (ECLAC) has developed the Climate Impact Assessment Model (ECLAC-CIAM), a tool that can simultaneously assess multiple sectoral climate impacts for the Caribbean as a whole and for individual countries. To achieve this goal, an Integrated Assessment Model (IAM) with a Computable General Equilibrium (CGE) core was developed, comprising three modules executed sequentially. The first module defines the type and magnitude of economic shocks on the basis of a climate change scenario; the second is a global CGE model with a special regional and industrial classification; and the third processes the output of the CGE model to produce more disaggregated results. The model can produce several economic estimates, but the current default output is the percentage change in real national income for individual Caribbean states, which provides a simple measure of welfare impacts. With some modifications, the model can also be used to consider the effects of single sectoral shocks (land, labour, capital and tourism) on the percentage change in real national income. Ultimately, the model is envisioned as an evolving tool for assessing the impact of climate change in the Caribbean and as a guide to policy responses with respect to adaptation strategies.

Relevance:

30.00%

Publisher:

Abstract:

Every year, thousands of surgical procedures are performed to repair or, where possible, completely replace organs or tissues affected by degenerative diseases. Patients with these kinds of illnesses wait a long time for a donor who can replace the damaged organ or tissue at short notice. The lack of biological alternatives to conventional surgical treatments such as autografts, allografts and xenografts led researchers from different fields to collaborate on innovative solutions. This research gave rise to a new discipline that merges knowledge from molecular biology, biomaterials, engineering, biomechanics and, more recently, design and architecture. This discipline is named Tissue Engineering (TE), and it represents a step towards substitutive or regenerative medicine. One of the major challenges of TE is to design and develop, using a biomimetic approach, an artificial 3D anatomical scaffold suitable for the adhesion of cells that can proliferate and differentiate in response to the biological and biophysical stimuli offered by the specific tissue to be replaced. Nowadays, powerful instruments allow increasingly accurate and well-defined analyses of patients who need more precise diagnoses and treatments. Starting from patient-specific information provided by CT (Computed Tomography), microCT and MRI (Magnetic Resonance Imaging), an image-based approach can be used to reconstruct the site to be replaced. With the aid of recent Additive Manufacturing techniques, which can print three-dimensional objects with sub-millimetric precision, it is now possible to exercise almost complete control over the parametric characteristics of the scaffold: this is the way to achieve correct cellular regeneration. In this work we focus on a branch of TE known as Bone TE, in which bone is the main subject.
Bone TE combines the osteoconductive and morphological aspects of the scaffold, whose main properties are pore diameter, structure porosity and interconnectivity. Realizing the ideal values of these parameters is the main goal of this work: here we create a simple and interactive biomimetic design process, based on 3D CAD modelling and generative algorithms, that provides a way to control the main properties and to create a structure morphologically similar to cancellous bone. Two different typologies of scaffold are compared: the first is based on Triply Periodic Minimal Surfaces (TPMS), whose basic crystalline geometries are nowadays used for Bone TE scaffolding; the second is based on Voronoi diagrams, which are more often used in the design of decorations and jewellery for their capacity to decompose and tessellate a volumetric space with a heterogeneous spatial distribution (frequent in nature). In this work we show how to manipulate the main properties (pore diameter, structure porosity and interconnectivity) of TE-oriented scaffold design through the implementation of generative algorithms: "bringing nature back to nature".
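The scaffold properties named above can be tied directly to a TPMS's implicit equation. As a minimal sketch (not the thesis's CAD/generative pipeline), the gyroid below is sampled on a voxel grid, and its level-set offset t, an assumed tuning knob, controls porosity:

```python
import numpy as np

def gyroid(x, y, z, t=0.0):
    """Implicit gyroid TPMS; the surface is the zero level set, and the
    offset t (a hypothetical tuning knob) shifts it to change porosity."""
    return (np.sin(x) * np.cos(y)
            + np.sin(y) * np.cos(z)
            + np.sin(z) * np.cos(x) - t)

def porosity(t, n=64):
    """Estimate porosity as the void fraction of one unit cell, sampled
    on an n^3 grid over [0, 2*pi)^3 (convention: solid where f > 0)."""
    s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    x, y, z = np.meshgrid(s, s, s, indexing="ij")
    solid = gyroid(x, y, z, t) > 0.0
    return 1.0 - solid.mean()

# t = 0 splits the unit cell into two congruent phases (porosity ~ 0.5);
# increasing t shrinks the solid phase and raises porosity.
p_half = porosity(0.0)
p_open = porosity(0.8)
```

The same voxel field can be handed to a marching-cubes step to produce a printable mesh; only the porosity estimate is shown here.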

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND AIMS Hepatitis C virus (HCV) infection is a leading cause of morbidity and mortality in people living with HIV. In many countries, access to direct-acting antiviral agents to treat HCV is restricted to individuals with advanced liver disease (METAVIR stage F3 or F4). Our goal was to estimate the long-term impact of deferring HCV treatment for men who have sex with men (MSM) who are coinfected with HIV and often have multiple risk factors for liver disease progression. METHODS We developed an individual-based model of liver disease progression in HIV/HCV-coinfected MSM. We estimated liver-related morbidity and mortality, as well as the median time spent with replicating HCV infection, when individuals were treated in liver fibrosis stage F0, F1, F2, F3 or F4 on the METAVIR scale. RESULTS The percentage of individuals who died of liver-related complications was 2% if treatment was initiated in F0 or F1. It increased to 3% if treatment was deferred until F2, 7% if deferred until F3, and 22% if deferred until F4. The median time individuals spent with replicating HCV increased from 5 years if treatment was initiated in F2 to almost 15 years if it was deferred until F4. CONCLUSIONS Deferring HCV therapy until advanced liver fibrosis is established could increase liver-related morbidity and mortality in HIV/HCV-coinfected individuals, and substantially prolong the time individuals spend with replicating HCV infection.
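The individual-based approach described can be sketched as an annual-cycle state simulation over the METAVIR stages. The transition and mortality probabilities below are illustrative placeholders, not the paper's calibrated values, and treatment is idealised as instantly halting progression:

```python
import random

P_PROGRESS = 0.1  # illustrative annual probability of advancing one fibrosis stage
P_DEATH = {0: 0.0, 1: 0.0, 2: 0.001, 3: 0.005, 4: 0.05}  # illustrative annual
                                                         # liver-related mortality

def liver_deaths(treat_stage, years=40, n=20_000, seed=1):
    """Fraction of a simulated cohort dying of liver-related causes when
    HCV treatment (assumed to halt fibrosis progression but not reverse
    existing fibrosis) starts at METAVIR stage index `treat_stage`."""
    rng = random.Random(seed)
    deaths = 0
    for _ in range(n):
        stage = 0  # F0
        for _ in range(years):
            if rng.random() < P_DEATH[stage]:
                deaths += 1
                break
            if stage < treat_stage and stage < 4 and rng.random() < P_PROGRESS:
                stage += 1
    return deaths / n

mortality_f1 = liver_deaths(1)
mortality_f3 = liver_deaths(3)
mortality_f4 = liver_deaths(4)
```

Even with these toy parameters, cohort mortality rises monotonically as the treatment stage is deferred, matching the direction (not the magnitudes) of the paper's findings.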

Relevance:

30.00%

Publisher:

Abstract:

The goal of this paper is to show how mathematics and computational science can help to design not only the geometry but also the operating conditions of different parts of a pulverized coal power plant.

Relevance:

30.00%

Publisher:

Abstract:

As a result of studies examining the factors involved in the learning process, various structural models have been developed to explain the direct and indirect effects that occur between the variables in these models. The objective was to evaluate a structural model of cognitive and motivational variables predicting academic achievement, including general intelligence, academic self-concept, goal orientations, effort and learning strategies. The sample comprised 341 Spanish students in the first year of compulsory secondary education. Different tests and questionnaires were used to evaluate each variable, and Structural Equation Modelling (SEM) was applied to test the relationships of the initial model. The proposed model had a satisfactory fit, and all the hypothesised relationships were significant. General intelligence was the variable best able to explain academic achievement. Also important were the direct influence of academic self-concept on achievement, goal orientations and effort, as well as the mediating role of effort and learning strategies between academic goals and final achievement.
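The mediation structure the abstract describes, with effort transmitting part of self-concept's influence to achievement, can be illustrated with path coefficients estimated by OLS. The variable names are borrowed from the study, but the data and coefficients below are synthetic, invented purely for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Synthetic data following an assumed path model (illustrative only, not
# the study's fitted model): self-concept -> effort -> achievement, plus
# a direct effect of intelligence on achievement.
intelligence = rng.normal(size=n)
self_concept = rng.normal(size=n)
effort = 0.6 * self_concept + rng.normal(scale=0.5, size=n)
achievement = (0.7 * intelligence + 0.4 * effort
               + rng.normal(scale=0.5, size=n))

def path_coefs(y, *xs):
    """OLS estimates of the path coefficients of y on its predictors."""
    X = np.column_stack(xs)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = path_coefs(effort, self_concept)[0]               # self-concept -> effort
b = path_coefs(achievement, intelligence, effort)[1]  # effort -> achievement
indirect = a * b  # mediated (indirect) effect of self-concept on achievement
```

A full SEM fit would estimate all paths jointly with fit indices; the per-equation OLS here only shows how an indirect effect is the product of the two path coefficients along the mediated route.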

Relevance:

30.00%

Publisher:

Abstract:

Society today is completely dependent on computer networks, the Internet and distributed systems, which place at our disposal the services needed to perform our daily tasks. Subconsciously, we rely increasingly on network management systems, which allow us to maintain, manage, configure, scale, adapt, modify, edit, protect and enhance the main distributed systems. Their role is secondary, unknown and transparent to users: they provide the support necessary to maintain the distributed systems whose services we use every day. If network management is not considered during the development stage of a distributed system, there can be serious consequences, up to the total failure of the development effort. It is therefore necessary to consider system management within the design of distributed systems, and to systematise its design so as to minimise the impact of network management on distributed systems projects. In this paper, we present a framework for designing network management systems systematically. To accomplish this goal, formal modelling tools are used to model, sequentially, different proposed views of the same problem. These views cover all the aspects involved in the system; they build on process definitions to identify responsibilities and define the agents involved, and propose a deployment on a distributed architecture that is both feasible and appropriate.

Relevance:

30.00%

Publisher:

Abstract:

The lower urinary tract is one of the most complex biological systems of the human body, as it involves the hydrodynamic properties of urine and muscle. Moreover, its complexity is increased by its being managed by both voluntary and involuntary neural systems. In this paper, a mathematical model of the lower urinary tract is proposed as a preliminary study to better understand its functioning. A further goal of the proposed mathematical model is to provide a basis for developing artificial control systems. The lower urinary tract comprises two interacting systems: the mechanical system and the neural regulator, the latter having the function of controlling the mechanical system to perform the voiding process. The results of the tests reproduce experimental data with a high degree of accuracy. These results also indicate that simulations of both healthy patients and patients with dysfunctions of neurological etiology produce urodynamic curves very similar to those obtained in clinical studies.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates how demographic (socioeconomic) and land-use (physical and environmental) data can be integrated within a decision support framework to formulate and evaluate land-use planning scenarios. A case-study approach is undertaken with land-use planning scenarios for a rapidly growing coastal area in Australia, the Shire of Hervey Bay. The town and surrounding area require careful planning of the future urban growth between competing land uses. Three potential urban growth scenarios are put forth to address this issue. Scenario A ('continued growth') is based on existing socioeconomic trends. Scenario B ('maximising rates base') is derived using optimisation modelling of land-valuation data. Scenario C ('sustainable development') is derived using a number of social, economic, and environmental factors and assigning weightings of importance to each factor using a multiple criteria analysis approach. The land-use planning scenarios are presented through the use of maps and tables within a geographical information system, which delineate future possible land-use allocations up until 2021. The planning scenarios are evaluated by using a goal-achievement matrix approach. The matrix is constructed with a number of criteria derived from key policy objectives outlined in the regional growth management framework and town planning schemes. The authors of this paper examine the final efficiency scores calculated for each of the three planning scenarios and discuss the advantages and disadvantages of the three land-use modelling approaches used to formulate the final scenarios.
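The goal-achievement matrix evaluation described reduces to a weighted sum of criterion scores per scenario. The criteria, weights and scores below are hypothetical stand-ins, not the paper's actual policy objectives or results:

```python
import numpy as np

# Hypothetical goal-achievement matrix (weights, criteria and scores are
# illustrative, not taken from the paper): rows are criteria, columns are
# the scenarios A ('continued growth'), B ('maximising rates base') and
# C ('sustainable development').
criteria = ["urban consolidation", "rates revenue", "habitat protection"]
weights = np.array([0.4, 0.3, 0.3])  # criterion importance weights, summing to 1
scores = np.array([                  # each scenario scored 0-10 per criterion
    [6, 5, 8],   # urban consolidation
    [5, 9, 6],   # rates revenue
    [4, 3, 9],   # habitat protection
])

efficiency = weights @ scores             # weighted efficiency score per scenario
best = "ABC"[int(np.argmax(efficiency))]  # scenario with the highest score
```

Changing the weight vector is exactly the sensitivity question such an evaluation raises: a council that weights rates revenue more heavily can flip the ranking.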

Relevance:

30.00%

Publisher:

Abstract:

A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work list scheduling to performers, the identification of time constraints, and more. The goal of this paper is to address an important issue in workflow modelling and specification: data flow, and its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, mainly focussing on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion on the essential requirements of the workflow data model needed to support data validation is also given.
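Two of the data-flow anomalies of the kind the paper targets, an activity reading data no earlier activity produced, and produced data that nothing ever reads, can be checked statically. The sketch below assumes a simplified sequential workflow with declared read/write sets per activity; the activity and data names are invented:

```python
def dataflow_problems(activities):
    """Scan a sequential workflow for two common data-flow anomalies:
    'missing' (an activity reads an item no earlier activity wrote) and
    'redundant' (a written item is never read). `activities` is a list
    of (name, reads, writes) triples in execution order."""
    produced, consumed, problems = set(), set(), []
    for name, reads, writes in activities:
        for item in reads:
            if item not in produced:
                problems.append(("missing", name, item))
        consumed |= set(reads)
        produced |= set(writes)
    for item in produced - consumed:
        problems.append(("redundant", None, item))
    return problems

# Hypothetical order-handling workflow with two seeded anomalies.
workflow = [
    ("receive_order", [], ["order"]),
    ("check_credit", ["order", "credit_limit"], ["approval"]),  # credit_limit never written
    ("ship", ["order", "approval"], ["tracking_no"]),           # tracking_no never read
]
issues = dataflow_problems(workflow)
```

Real workflow graphs branch and merge, so a full validator would analyse all paths rather than one sequence; the linear scan only illustrates the anomaly definitions.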

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an application of decoupled probabilistic world modeling to achieve team planning. The research is based on the principle that the action selection mechanism of a member of a robot team can select an effective action if a global world model is available to all team members. In the real world, sensors are imprecise and individual to each robot, providing each robot with a partial and unique view of the environment. We address this problem by creating a probabilistic global view on each agent by combining the perceptual information from each robot. This probabilistic view forms the basis for selecting actions to achieve the team goal in a dynamic environment. Experiments have been carried out to investigate the effectiveness of this principle using custom-built robots for real-world performance, in addition to extensive simulation results. The results show an improvement in team effectiveness when using probabilistic world modeling based on perception sharing for team planning.
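Combining per-robot perceptual estimates into a probabilistic global view can be sketched as an independent-evidence (odds-product) update. The abstract does not specify the paper's actual fusion rule, so the formula below is an assumption for illustration:

```python
def fuse(estimates, prior=0.5):
    """Fuse independent per-robot probability estimates that some world
    fact holds (e.g. a cell is occupied), via a naive-Bayes odds-product
    update against a shared prior. The combination rule is an assumed
    stand-in, not the paper's published method."""
    prior_odds = prior / (1.0 - prior)
    odds = prior_odds
    for p in estimates:
        odds *= (p / (1.0 - p)) / prior_odds
    return odds / (1.0 + odds)

# Two robots that each weakly believe a cell is occupied yield a fused
# belief stronger than either individual estimate; conflicting reports
# of equal strength cancel back to the prior.
fused = fuse([0.7, 0.7])
cancelled = fuse([0.7, 0.3])
```

Because every robot applies the same deterministic update to the shared estimates, each agent reconstructs an identical global view without a central server, which is the "decoupled" property the paper exploits.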

Relevance:

30.00%

Publisher:

Abstract:

In the last two decades there have been substantial developments in the mathematical theory of inverse optimization problems, and their applications have expanded greatly. In parallel, time series analysis and forecasting have become increasingly important in various fields of research such as data mining, economics, business, engineering, medicine, politics, and many others. Despite the widespread use of linear programming in forecasting models, not a single application of inverse optimization has been reported in the forecasting literature for cases where time series data are available. The goal of this paper is therefore to introduce inverse optimization into the forecasting field, and to provide a streamlined approach to time series analysis and forecasting using inverse linear programming. An application is used to demonstrate the inverse forecasting approach developed in this study. © 2007 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Presents a prototype modelling methodology that provides a generic approach to the creation of quantitative models of the relationships between a working environment, the direct workers and their subsequent performance. Once created for an organisation, such models can predict how the behaviour of its workers will alter in response to changes in their working environment. The goal of this work is to improve the decision processes used in the design of the working environment: through improving such processes, companies will gain better performance from their direct workers, and so improve business competitiveness. This paper first presents the need to model the behaviour of direct workers in manufacturing environments. To begin to address this need, a simple modelling framework is developed, which is then expanded into a detailed modelling methodology. There follows a description of an industrial evaluation of this methodology at the Ford Motor Company, in which the methodology was assessed and found to be valid for this case. There are many challenges that this theme of research needs to address; the work described in this paper has made an important first step, having gone some way towards establishing a generic methodology and illustrating its potential value. Our future work will build on this foundation.

Relevance:

30.00%

Publisher:

Abstract:

Although the Standard Cosmological Model is generally accepted by the scientific community, a number of unresolved issues remain. From the observable characteristics of the structures in the Universe, it should be possible to impose constraints on the cosmological parameters. Cosmic voids (CV) are a major component of the large-scale structure (LSS) and have been shown to possess great potential for constraining dark energy and testing theories of gravity, but a gap between void observations and theory still persists. A theoretical model for the statistical distribution of voids as a function of size exists (the SvdW model); however, it has been unsuccessful in reproducing the results obtained from cosmological simulations, which undermines the possibility of using voids as cosmological probes. The goal of this thesis work is to close the gap between theoretical predictions and measured distributions of cosmic voids. We develop an algorithm to identify voids in simulations consistently with theory, and inspect the possibilities offered by a recently proposed refinement of the SvdW model (the Vdn model, Jennings et al., 2013). Comparing void catalogues to theory, we validate the Vdn model, finding that it is reliable over a large range of radii, at all the redshifts considered and for all the cosmological models inspected. We have then searched for a size function model for voids identified in a distribution of biased tracers. We find that naively applying the same procedure used for unbiased tracers to a halo mock distribution does not provide successful results, suggesting that the Vdn model must be reconsidered when dealing with biased samples. We therefore test two alternative extensions of the model and find that two scaling relations exist: both the dark matter void radii and the underlying dark matter density contrast scale with the halo-defined void radii. We use these findings to develop a semi-analytical model which gives promising results.

Relevance:

30.00%

Publisher:

Abstract:

One of the leading motivations behind the multilingual semantic web is to make resources digitally accessible in an online global multilingual context. Consequently, it is fundamental for knowledge bases to manage multilingualism and thus to be equipped with procedures for its conceptual modelling. In this context, the goal of this paper is to discuss how common-sense knowledge and cultural knowledge are modelled in a multilingual framework. More particularly, multilingualism and conceptual modelling are dealt with from the perspective of FunGramKB, a lexico-conceptual knowledge base for natural language understanding. This project argues for a clear division between the lexical and the conceptual dimensions of knowledge. Moreover, the conceptual layer is organized into three modules, which result from a strong commitment to capturing semantic knowledge (Ontology), procedural knowledge (Cognicon) and episodic knowledge (Onomasticon). Cultural mismatches are discussed and formally represented at the three conceptual levels of FunGramKB.

Relevance:

30.00%

Publisher:

Abstract:

The main goal of this paper is to present and validate a methodology for designing efficient automatic controllers for irrigation canals, based on the Saint-Venant model. This model-based methodology makes it possible to design controllers at the design stage, before the canal is built. The methodology is applied to an experimental canal located in Portugal. First, the full nonlinear PDE model is calibrated using a single steady-state experiment. The model is then linearized around an operating point in order to design linear PI controllers. Two classical control strategies (local upstream control and distant downstream control) are tested and compared on the canal. The experimental results show the effectiveness of the model.
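The PI level-control loop described can be illustrated on a drastically simplified pool model: a single integrator standing in for the linearized Saint-Venant dynamics. The surface area, flows and gains below are invented for the sketch, not the experimental canal's values:

```python
def simulate_pi(kp, ki, setpoint=2.0, steps=600, dt=1.0):
    """Water-level control sketch: a PI controller adjusts the upstream
    gate inflow to hold the downstream level of a canal pool at
    `setpoint`. The pool is reduced to a single integrator of surface
    area A, a crude stand-in for linearized Saint-Venant dynamics."""
    A = 500.0    # pool surface area (m^2), illustrative
    q_out = 1.0  # constant downstream offtake flow (m^3/s), illustrative
    level, integral = 1.5, 0.0
    for _ in range(steps):
        error = setpoint - level
        integral += error * dt
        q_in = max(0.0, kp * error + ki * integral)  # gate flow cannot go negative
        level += (q_in - q_out) * dt / A             # mass balance on the pool
    return level

pi_level = simulate_pi(kp=20.0, ki=0.2)  # PI: integral action removes the offset
p_level = simulate_pi(kp=20.0, ki=0.0)   # P only: steady-state error remains
```

The comparison mirrors why PI (rather than pure proportional) control is standard for canal pools: with ki = 0 the level settles at setpoint - q_out/kp, short of the target, while the integral term drives the error to zero.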