970 results for Location-Allocation Models


Relevance: 20.00%

Publisher:

Abstract:

This paper develops collision prediction models for three-leg junctions located on national roads (NR) in Northern Portugal. The aim is twofold: to identify the factors that contribute to collision-type crashes at those locations, with emphasis on road geometric consistency, on which the literature is scarce; and to assess how three modeling methods, generalized estimating equations, random-effects negative binomial models and random-parameters negative binomial models, affect the factors retained in those models. The database covered 177 three-leg junctions, with data published between 2008 and 2010. It was split into three groups of contributing factors, tested sequentially for each of the adopted models: first, traffic alone; then, traffic plus the geometric characteristics of the junctions within their area of influence; and, lastly, factors expressing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The best modeling technique was chosen by cross-validation over the three sets of researched contributing factors, and the models fitted with random-parameters negative binomial models performed best. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures of geometric consistency, together with traffic volume, contribute significantly to the number of collisions. Both the variables describing the junctions and the national highway segments within their area of influence, and the variations of those characteristics in the bordering roadway segments, proved relevant; there is therefore a clear need to incorporate the effect of geometric consistency in safety studies of three-leg junctions.
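As a hedged illustration of why the negative binomial family suits crash counts, the sketch below (not from the paper; the mean and dispersion values are invented) evaluates the NB2 log-probability mass function, whose variance μ + αμ² captures the overdispersion that a plain Poisson model misses:

```python
import math

def nb2_logpmf(y, mu, alpha):
    """Log-pmf of the NB2 negative binomial: Var = mu + alpha * mu**2."""
    r = 1.0 / alpha  # dispersion alpha -> NB "size" parameter
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))

# Sanity check: the pmf should sum to ~1 over a wide truncated support.
total = sum(math.exp(nb2_logpmf(y, mu=3.0, alpha=0.5)) for y in range(200))
print(round(total, 6))  # ~1.0
```

With alpha → 0 this distribution collapses to the Poisson; a positive alpha inflates the variance, which is the usual justification for NB-based crash models.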

Abstract:

This article describes the main approaches adopted in a study on planning industrial estates at a sub-regional scale. The study was supported by an agent-based model in which firms act as agents assessing the attractiveness of industrial estates. The simulation was implemented in the NetLogo toolkit, with the environment representing a geographical space. Three scenarios and four hypotheses were simulated to test the impact of different policies on the attractiveness of industrial estates. Policies were distinguished by the level of municipal coordination at which they were implemented and by the type of intervention. In the model, the attractiveness of an industrial estate was based on its level of facilities, amenities and accessibility and on its land price. Firms are able to move and relocate whenever they find an attractive estate; relocating firms were selected by their size, location and distance to an industrial estate. Results show that a policy coordinated among municipalities is the most efficient policy to promote advanced, qualified estates: in these scenarios, more industrial estates became attractive, more firms relocated and more vacant lots were occupied. Furthermore, the results also indicate that promoting widespread industrial estates with poor-quality infrastructures and amenities is an inefficient policy for attracting firms.
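A minimal sketch of the attractiveness rule described above (the weights, estate names and attribute values are invented for illustration; the actual NetLogo model is richer):

```python
def attractiveness(estate, w=(0.3, 0.2, 0.3, 0.2)):
    """Score an estate on facilities, amenities and accessibility, minus land price."""
    return (w[0] * estate["facilities"] + w[1] * estate["amenities"]
            + w[2] * estate["access"] - w[3] * estate["price"])

estates = {
    "A": {"facilities": 0.9, "amenities": 0.8, "access": 0.7, "price": 0.6},
    "B": {"facilities": 0.3, "amenities": 0.2, "access": 0.5, "price": 0.2},
}

def best_estate(estates):
    """A relocating firm-agent picks the highest-scoring estate."""
    return max(estates, key=lambda k: attractiveness(estates[k]))

print(best_estate(estates))  # -> "A"
```

Raising the weight on price (or cutting facility levels) flips the choice, which is essentially what the policy scenarios manipulate.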

Abstract:

This study analyzes business intelligence applications for the banking industry. Searches were performed in relevant journals, resulting in 219 articles published between 2002 and 2013. To analyze such a large number of manuscripts, text mining techniques were used to search for relevant terms in both the business intelligence and banking domains. Moreover, latent Dirichlet allocation modeling was used to group the articles into several relevant topics. The analysis was conducted using a dictionary of terms belonging to both the banking and business intelligence domains. This procedure allowed the identification of relationships between terms and the topics grouping the articles, letting hypotheses about research directions emerge. To confirm these hypotheses, relevant articles were collected and scrutinized, validating the text mining procedure. The results show that credit is clearly the main application trend in banking, particularly predicting risk and thus supporting credit approval or denial. There is also relevant interest in bankruptcy and fraud prediction. Customer retention seems to be associated, although weakly, with targeting, justifying bank offers to reduce churn. In addition, a large number of articles focused more on business intelligence techniques and their applications, using the banking industry merely for evaluation and thus not clearly claiming benefits for the banking business. By identifying these current research topics, this study also highlights opportunities for future research.
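The dictionary-based term analysis can be sketched as follows (the two term sets below are invented examples, not the study's actual dictionary):

```python
# Hypothetical term dictionaries for the two domains.
BANKING = {"credit", "risk", "bankruptcy", "fraud", "churn"}
BI = {"mining", "clustering", "prediction", "dirichlet"}

def term_profile(text):
    """Count how many dictionary terms from each domain appear in a text."""
    words = set(text.lower().split())
    return {"banking": len(words & BANKING), "bi": len(words & BI)}

print(term_profile("Credit risk prediction with data mining"))
# -> {'banking': 2, 'bi': 2}
```

Profiles like these, aggregated over the 219 articles, are what lets topics such as "credit risk" surface as the dominant application trend.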

Abstract:

Earthworks aim to level the ground surface in a target construction area and precede any kind of structural construction (e.g., road and railway construction). They comprise sequential tasks, such as excavation, transportation, spreading and compaction, and rely heavily on mechanical equipment and repetitive processes. In this context, it is essential to optimize the use of all available resources under two key criteria: the cost and duration of the earthwork project. In this paper, we present an integrated system that uses two artificial intelligence techniques: data mining and evolutionary multi-objective optimization. The former builds data-driven models capable of providing realistic estimates of resource productivity, while the latter optimizes resource allocation with respect to the two main earthwork objectives (duration and cost). Experiments with real-world data from a construction site have shown that the proposed system is competitive with current manual earthwork design.
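A hedged sketch of the multi-objective side: given candidate resource allocations scored on (duration, cost), an evolutionary optimizer keeps the non-dominated (Pareto) set; the candidate plans below are invented numbers, not the paper's data:

```python
def pareto_front(solutions):
    """Keep (duration, cost) pairs not dominated by any other pair (both minimized)."""
    front = []
    for s in solutions:
        dominated = any(o != s and o[0] <= s[0] and o[1] <= s[1] for o in solutions)
        if not dominated:
            front.append(s)
    return front

plans = [(10, 100), (12, 80), (11, 95), (12, 90), (15, 70)]
print(pareto_front(plans))  # (12, 90) is dominated by (12, 80) and drops out
```

The decision-maker then picks one trade-off from the surviving front, instead of a single "optimal" plan.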

Abstract:

This paper presents a proposal for a management model based on reliability requirements for Cloud Computing (CC). The proposal builds on a literature review of the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment, examining the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This proactive proposal aims to organize and discuss management solutions for the CC environment, with a view to improving the reliability of IS applications in operation, for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, cloud service providers and consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. Organizing these studies and disseminating them in the market as a conceptual model able to set parameters for a reliable relationship between provider and user of IT services in the CC environment is a useful instrument to guide providers, developers and users towards secure and reliable services and applications.

Abstract:

Archaeology and related areas have a special interest in cultural heritage sites, since they provide valuable information about past civilizations. However, the ancient buildings present at these sites are commonly found in an advanced state of degradation, which hinders professional/expert analysis. Virtual reconstructions of such buildings aim to provide a digital insight into how these historical places could have looked in ancient times. Moreover, the visualization of such models has been explored by Augmented Reality (AR) systems capable of supporting experts, and their compelling and appealing environments have also been used to promote the social and cultural participation of the general public. Existing AR solutions on this theme rarely explore the potential of realism, due to two limitations: the exploration of mixed environments is usually supported only indoors or outdoors, not both in the same system; and the adaptation of the illumination conditions to the reconstructed structures is rarely addressed, causing a loss of credibility. MixAR [1] is a system that addresses these challenges, aiming to visualize virtual buildings augmented upon real ruins, with soft transitions between their interiors and exteriors and relighting techniques for a faithful interior illumination, while the user moves freely around a given cultural heritage site carrying a mobile unit. In this paper, we report the current state of the MixAR mobile unit prototype, which allows visualizing virtual buildings, properly aligned with real-world structures, based on the user's location during outdoor navigation. To evaluate the prototype's performance, a set of tests was made using virtual models of different complexities.

Abstract:

Developing and implementing data-oriented workflows for data migration is a complex task, involving several problems related to the integration of data coming from different schemas. Such workflows usually have very specific requirements: every process is almost unique. Having a way to abstract their representation helps us better understand and validate them with business users, which is a crucial step for requirements validation. In this demo we present an approach that incrementally enriches conceptual models in order to support the automatic production of their corresponding physical implementation. We will show how the B2K (Business to Kettle) system transforms BPMN 2.0 conceptual models into Kettle data-integration executable processes, covering the most relevant aspects of model design and enrichment, model-to-system transformation, and system execution.

Abstract:

ETL conceptual modeling is a very important activity in any data warehousing system project. Owning a high-level system representation that allows a clear identification of the main parts of a data warehousing system is a great advantage, especially in the early stages of design and development. However, the effort of modeling an ETL system conceptually is rarely properly rewarded: translating ETL conceptual models directly into something that saves work and time on the concrete implementation of the system would, in fact, be a great help. In this paper we present and discuss a hybrid approach to this problem, combining the simplicity of interpretation and expressive power of BPMN for ETL system conceptualization with the use of ETL patterns to automatically produce an ETL skeleton, a first prototype system, which can be executed in a commercial ETL tool such as Kettle.

Abstract:

This work provides analytical and numerical solutions for the linear, quadratic and exponential Phan–Thien–Tanner (PTT) viscoelastic models, for axial and helical annular fully developed flows under no-slip and slip boundary conditions, the latter given by the linear and nonlinear Navier slip laws. The rheology of the three PTT model functions is discussed, together with the influence of the slip velocity on the flow velocity and stress fields. For the linear PTT model, full analytical solutions of the inverse problem (unknown velocity) are devised for the linear Navier slip law and two different slip exponents. For the linear PTT model with other values of the slip exponent, and for the quadratic PTT model, the polynomial equation for the radial location (β) of the null shear stress must be solved numerically. For both models, the solution of the direct problem is given by an iterative procedure involving three nonlinear equations: one for β, one for the pressure gradient and one for the torque per unit length. For the exponential PTT model, we devise a numerical procedure that can easily compute the numerical solution of the pure axial flow problem.
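For reference, the three PTT stress-coefficient functions and the Navier slip laws discussed above are commonly written as follows (ε is the extensibility parameter, λ the relaxation time, η the polymer viscosity, and k, m the slip coefficient and exponent; the exact notation may differ slightly from the paper's):

```latex
f(\operatorname{tr}\boldsymbol{\tau}) =
\begin{cases}
1 + \dfrac{\varepsilon\lambda}{\eta}\operatorname{tr}\boldsymbol{\tau}
  & \text{(linear)}\\[6pt]
1 + \dfrac{\varepsilon\lambda}{\eta}\operatorname{tr}\boldsymbol{\tau}
  + \dfrac{1}{2}\left(\dfrac{\varepsilon\lambda}{\eta}
    \operatorname{tr}\boldsymbol{\tau}\right)^{2}
  & \text{(quadratic)}\\[6pt]
\exp\!\left(\dfrac{\varepsilon\lambda}{\eta}
  \operatorname{tr}\boldsymbol{\tau}\right)
  & \text{(exponential)}
\end{cases}
\qquad
u_{ws} = k\,\tau_{w}\ \text{(linear slip)},\quad
u_{ws} = k\,\tau_{w}^{\,m}\ \text{(nonlinear slip)}
```

The linear and quadratic functions are the first- and second-order Taylor expansions of the exponential form, which is why the three models share the same limiting Newtonian behavior.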

Abstract:

This work reports the implementation and verification of a new solver in the OpenFOAM® open-source computational library, able to cope with integral viscoelastic models based on the integral upper-convected Maxwell model. The code is verified by comparing its predictions with analytical solutions and with numerical results obtained with the differential upper-convected Maxwell model.
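For context, the integral counterpart of the upper-convected Maxwell model (the Lodge form) that such a solver must evaluate is usually written as shown below, where η₀ is the zero-shear-rate viscosity, λ the relaxation time, B(t′, t) the Finger strain tensor between times t′ and t, and I the identity tensor; the exact notation may differ from the paper's:

```latex
\boldsymbol{\tau}(t) = \int_{-\infty}^{t}
\frac{\eta_{0}}{\lambda^{2}}\,
e^{-(t - t')/\lambda}
\left[\mathbf{B}(t', t) - \mathbf{I}\right]\,\mathrm{d}t'
```

Evaluating this history integral along particle paths is what distinguishes an integral solver from the differential formulation used for verification.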

Abstract:

In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches to calibration are used: scaling up canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. The models are validated using eddy covariance data from the LBA site C14. Comparing the performance of both models, we conclude that both numerically (in terms of goodness of fit) and qualitatively (in terms of residual response to different environmental variables) the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit, fails to respond to variations in the diffuse fraction, and has skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, combined with the separation of the incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
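A minimal sketch of the sunlit/shaded canopy split underlying sun/shade models (a de Pury and Farquhar style partition; the beam extinction coefficient k_b = 0.5 and the LAI value are invented for illustration, not the study's calibrated parameters):

```python
import math

def sunlit_shaded_lai(lai, k_b=0.5):
    """Split total leaf area index into sunlit and shaded parts.

    The sunlit fraction at canopy depth L is exp(-k_b * L); integrating
    over the canopy gives the total sunlit LAI below.
    """
    lai_sun = (1.0 - math.exp(-k_b * lai)) / k_b
    return lai_sun, lai - lai_sun

sun, shade = sunlit_shaded_lai(4.0)
print(round(sun, 3), round(shade, 3))
```

Sunlit leaves then receive direct beam plus diffuse light while shaded leaves receive only diffuse light, which is what lets the model respond to changes in the diffuse fraction that a big-leaf model cannot see.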

Abstract:

Doctoral thesis in Administration Sciences

Abstract:

This review deals with recent developments in, and the present status of, theoretical models for simulating the performance of lithium-ion batteries. Preceded by a description of the main materials used for each battery component (anode, cathode and separator) and of how material characteristics affect battery performance, a description of the main theoretical models of battery operation and performance is presented. The influence of the most relevant parameters of the models, such as boundary conditions, geometry and material characteristics, is discussed. Finally, suggestions for future work are proposed.

Abstract:

Doctoral Programme in Industrial and Systems Engineering.

Abstract:

Project management involves one-time endeavors that demand getting it right the first time. Yet project scheduling, one of the most modeled stages of the project management process, still faces a wide gap between theory and practice. Computationally demanding models, and the consequent call for their simplification, divert the implementation of such models in project management tools from the actual day-to-day project management process. Special focus is given to the robustness of the generated project schedules in the face of omnipresent uncertainty. An "easy" way out is to add time buffers, more or less cleverly calculated, which always increase project duration and, correspondingly, cost. A better approach to dealing with uncertainty is to explore the slack that may be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining this approach with recent advances in modeling resource allocation and scheduling to cope with increasing resource flexibility, as expressed in "Flexible Resource Constraint Project Scheduling Problem" (FRCPSP) formulations, should be a promising line of research towards more adequate project management tools. In practice, project managers have frequently applied this approach in an ad hoc way.
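The slack the text proposes to exploit can be computed with a classic critical-path forward/backward pass; a minimal sketch over an invented four-activity network (durations and precedence are illustrative, not from the text):

```python
# Each task maps to (duration, list of predecessors); keys are listed
# in topological order, which the passes below rely on.
tasks = {
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
}

def total_slack(tasks):
    order = list(tasks)  # assumed topologically sorted
    # Forward pass: earliest start of each task.
    es = {}
    for t in order:
        dur, preds = tasks[t]
        es[t] = max((es[p] + tasks[p][0] for p in preds), default=0)
    project_end = max(es[t] + tasks[t][0] for t in tasks)
    # Backward pass: latest start without delaying the project.
    ls = {}
    for t in reversed(order):
        succs = [s for s in tasks if t in tasks[s][1]]
        ls[t] = min((ls[s] for s in succs), default=project_end) - tasks[t][0]
    return {t: ls[t] - es[t] for t in tasks}

print(total_slack(tasks))  # -> {'A': 0, 'B': 2, 'C': 0, 'D': 0}
```

Tasks with zero slack (A, C, D here) form the critical path; the two time units of slack on B are exactly the kind of buffer-free robustness reserve that FRCPSP-style approaches try to exploit.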