Abstract:
In recent years, cities around the world have invested substantial sums in measures to reduce congestion and car trips. These investments are potential responses to the well-known urban sprawl phenomenon, also called the "development trap", which leads to further congestion and a higher proportion of our time spent in slow-moving cars. In this search for solutions, the complex relationship between the urban environment and travel behaviour has been studied in a number of cases. The main question under discussion is how to encourage multi-stop tours. The objective of this paper is therefore to verify whether unobserved factors influence tour complexity. For this purpose, we use a database from a survey conducted in 2006-2007 in Madrid, a suitable case study for analyzing urban sprawl given its new urban developments and the substantial changes in its mobility patterns in recent years. A total of 943 individuals were interviewed across 3 selected neighbourhoods (CBD, urban and suburban). We study the effect of unobserved factors on trip frequency. This paper presents the estimation of a hybrid model in which the latent variable, called propensity to travel, feeds a discrete choice model with 5 tour-type alternatives. The results show that the characteristics of the neighbourhoods in Madrid are important for explaining trip frequency. The influence of land-use variables on trip generation is clear, in particular the presence of retail establishments. Through the estimation of elasticities and through forecasting, we determine to what extent land-use policy measures modify travel demand. Comparing aggregate elasticities with percentage variations shows that percentage variations could lead to inconsistent results. The results also show that hybrid models explain travel behaviour better than traditional discrete choice models.
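For readers unfamiliar with this model class, a hybrid choice model of the kind estimated here couples a latent-variable system to a discrete choice kernel. The sketch below uses generic notation (z_n individual characteristics, x_jn alternative attributes, I_kn attitudinal survey indicators) and illustrates the general structure only, not the paper's exact specification:

    % structural equation for the latent propensity to travel of individual n
    \eta_n = \gamma' z_n + \omega_n, \qquad \omega_n \sim N(0, \sigma_\omega^2)
    % utility of tour type j (one of the 5 alternatives), with the latent variable entering utility
    U_{jn} = \beta' x_{jn} + \lambda_j \eta_n + \varepsilon_{jn}
    % measurement equations linking the latent variable to the survey indicators
    I_{kn} = \alpha_k + \zeta_k \eta_n + \nu_{kn}

Estimation proceeds by maximizing the joint likelihood of the observed tour choices and indicators, integrating over the distribution of \eta_n.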
Abstract:
Office automation is one of the fields in which the complexity associated with technologies and working environments is most evident. This is the starting point we have chosen for building a theoretical model that shows a scene quite different from the one traditionally considered. Through the development of the model, the levels of complexity associated with office automation and office environments are identified, and a relationship between them is established. The model thus allows us to state a general principle for the sociotechnical design of office automation systems, comprising the ontological distinctions needed to properly evaluate each particular technology and its potential contribution to office automation. From this follows the model's taxonomic ability to draw a global picture of the state of the art in office automation technologies.
Abstract:
PURPOSE: The decision-making process plays a key role in organizations. Every decision-making process produces a final choice that may or may not prompt action. Decision makers recurrently face a dichotomous question: follow a traditional sequential decision-making process, where the output of one decision is used as the input of the next stage, or follow a joint decision-making approach, where several decisions are taken simultaneously. The decision-making process affects different players in the organization, and the choice of approach is difficult to make even with the current literature and practitioners' knowledge. The pursuit of better ways of making decisions has been a common goal of academics and practitioners, and management scientists use different techniques and approaches to improve different types of decisions. The purpose of these decisions is to use the available resources (data and techniques) as well as possible to achieve the objectives of the organization. Developing and applying models and concepts can help solve the managerial problems that different companies face every day. As a result of this research, different decision models are presented as contributions to the body of knowledge of management science. The first models focus on the manufacturing industry and the second set on the health care industry. Although these models are case specific, they serve to show that different approaches to the same problems can provide interesting results. Unfortunately, there is no universal recipe that can be applied to all problems; moreover, the same model may deliver good results with certain data and bad results with other data. A framework for analysing the data before selecting the model to be used is therefore presented and tested on the models developed to exemplify these ideas.

METHODOLOGY: As the first step of the research, a systematic literature review on joint decision making is presented, along with the opinions and suggestions of different scholars. In the next stage of the thesis, the decision-making process of more than 50 companies from different sectors was analysed in the production planning area at the job shop level; the data were obtained through surveys and face-to-face interviews. The following part of the research addressed the decision-making process in two application fields that are highly relevant for our society: manufacturing and health care. The first step was to study the interactions and develop a mathematical model for the replenishment of the car assembly line, combining the vehicle routing problem with inventory management. The next step was to add the scheduling of car production (car sequencing) and to use metaheuristics such as ant colony optimization and genetic algorithms to test whether the behaviour holds for problems of different sizes. A similar approach is presented for the production of semiconductors and aviation parts, where a hoist has to move from one station to another to carry out the work and a job schedule has to be produced; for this problem, however, simulation was used for experimentation. In parallel, the scheduling of operating rooms was studied: surgeries were allocated to surgeons and the scheduling of operating rooms was analysed. The first part of this research was carried out in a teaching hospital, and in the second part the interaction with uncertainty was added. Once these problems had been analysed, a general framework for characterizing the problem instance was built. The final chapter presents a general conclusion.

FINDINGS AND PRACTICAL IMPLICATIONS: The first contribution is an update of the decision-making literature review, together with an analysis of the possible savings resulting from a change in the decision process. The survey results are then presented, revealing a lack of consistency between what managers believe and the actual degree of integration of their decisions. The next stage of the thesis contributes to the body of knowledge of operations research with a joint solution of the replenishment, sequencing and inventory problems on the assembly line, together with parallel work on operating room scheduling for which different solution approaches are presented. Beyond the solution methods themselves, the main contribution is the proposed framework for pre-evaluating a problem before choosing the techniques to solve it. There is, however, no straightforward answer as to whether joint or sequential solutions are better; following the proposed framework and evaluating factors such as the flexibility of the answer, the number of actors, and the tightness of the data gives important hints as to the most suitable way to tackle the problem.

RESEARCH LIMITATIONS AND AVENUES FOR FUTURE RESEARCH: In the first part of the work it was very difficult to calculate the possible savings of different projects, since many papers do not report these quantities or base the impact on non-quantifiable benefits; another issue is the confidentiality of many projects whose data cannot be presented. For the car assembly line problem, more computational power would allow bigger instances to be solved. For the operating room scheduling problem, there was a lack of historical data to perform a parallel analysis in the teaching hospital. To keep testing the decision framework it is necessary to apply it to more case studies, in order to generalize the results and make them more evident and less ambiguous. The health care field offers great opportunities: despite the recent awareness of the need to improve decision-making processes, much room for improvement remains. Another big difference from the automotive industry is that recent improvements are not spread among all the actors. Future research will therefore focus more on collaboration between academia and the health care sector.
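As a rough illustration of the metaheuristics mentioned in the methodology, the following is a minimal genetic-algorithm sketch for a car-sequencing-style permutation problem; the encoding, the capacity rule, and all parameters are placeholders, not the thesis's actual formulation:

    import random

    def violations(sequence, option_cars, max_per_window, window):
        """Count capacity-rule violations: at most `max_per_window` option cars
        in any run of `window` consecutive positions (a standard sequencing rule)."""
        flags = [1 if car in option_cars else 0 for car in sequence]
        return sum(max(0, sum(flags[i:i + window]) - max_per_window)
                   for i in range(len(flags) - window + 1))

    def genetic_algorithm(cars, option_cars, generations=200, pop_size=50,
                          max_per_window=1, window=3, mutation_rate=0.2):
        """Evolve permutations of `cars`, minimising capacity-rule violations."""
        pop = [random.sample(cars, len(cars)) for _ in range(pop_size)]

        def fitness(seq):
            return violations(seq, option_cars, max_per_window, window)

        for _ in range(generations):
            pop.sort(key=fitness)
            survivors = pop[:pop_size // 2]                 # keep the best half (elitism)
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, len(cars))        # simplified order crossover
                child = a[:cut] + [c for c in b if c not in a[:cut]]
                if random.random() < mutation_rate:         # swap mutation
                    i, j = random.sample(range(len(cars)), 2)
                    child[i], child[j] = child[j], child[i]
                children.append(child)
            pop = survivors + children
        return min(pop, key=fitness)

    # Hypothetical instance: 10 cars, cars 0-4 carry an option whose station
    # tolerates at most 1 such car in any window of 3 consecutive positions.
    best = genetic_algorithm(list(range(10)), option_cars={0, 1, 2, 3, 4})
    print(best, violations(best, {0, 1, 2, 3, 4}, 1, 3))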
Abstract:
The spatial distribution of organic matter, chemicals, nutrients and pollutants has been demonstrated to have a multifractal nature (Kravchenco et al. [1]). This fact supports the possible existence of an emergent heterogeneity structure built up during the evolution of the system. The aim of this note is to provide a consistent explanation of these results via an extremely simple model.
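For context, the multifractal characterization referred to above is conventionally expressed through the scaling of a box-counting partition function; these standard definitions (not specific to this note) make the term precise:

    \chi(q, \varepsilon) = \sum_i \mu_i(\varepsilon)^q \sim \varepsilon^{\tau(q)}, \qquad D_q = \frac{\tau(q)}{q - 1}

where \mu_i(\varepsilon) is the fraction of the measure (e.g., nutrient mass) contained in the i-th box of size \varepsilon. A multifractal distribution exhibits a nonlinear \tau(q), i.e., a whole spectrum of generalized dimensions D_q rather than a single fractal dimension.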
Abstract:
Abrupt climate changes from 18 to 15 thousand years before present (kyr BP) associated with Heinrich Event 1 (HE1) had a strong impact on vegetation patterns not only at high latitudes of the Northern Hemisphere, but also in the tropical regions around the Atlantic Ocean. To gain a better understanding of the linkage between high and low latitudes, we used the University of Victoria (UVic) Earth System-Climate Model (ESCM) with dynamical vegetation and land surface components to simulate four scenarios of climate-vegetation interaction: the pre-industrial era, the Last Glacial Maximum (LGM), and a Heinrich-like event with two different climate backgrounds (interglacial and glacial). We calculated mega-biomes from the plant-functional types (PFTs) generated by the model to allow for a direct comparison between model results and palynological vegetation reconstructions. Our calculated mega-biomes for the pre-industrial period and the LGM corresponded well with biome reconstructions of the modern and LGM time slices, respectively, except that our pre-industrial simulation predicted the dominance of grassland in southern Europe and our LGM simulation resulted in more forest cover in tropical and sub-tropical South America. The HE1-like simulation with a glacial climate background produced sea-surface temperature patterns and enhanced inter-hemispheric thermal gradients in accordance with the "bipolar seesaw" hypothesis. We found that the cooling of the Northern Hemisphere caused a southward shift of those PFTs that are indicative of increased desertification and a retreat of broadleaf forests in West Africa and northern South America. The mega-biomes from our HE1 simulation agreed well with paleovegetation data from tropical Africa and northern South America. Thus, according to our model-data comparison, the reconstructed vegetation changes for the tropical regions around the Atlantic Ocean were physically consistent with the remote effects of a Heinrich event under a glacial climate background.
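The mega-biome calculation described above amounts to classifying each grid cell from its simulated PFT composition. The sketch below shows one dominance-based classification of this kind; the thresholds and category names are illustrative assumptions, not the UVic ESCM's actual scheme:

    def mega_biome(tree_frac, grass_frac, needleleaf_frac, temp_c):
        """Classify a grid cell into a coarse mega-biome from simulated PFT
        cover fractions and mean annual temperature (illustrative thresholds)."""
        if tree_frac < 0.1 and grass_frac < 0.1:
            return "desert"
        if tree_frac >= 0.6:
            if temp_c > 15:
                return "tropical forest"
            return "boreal forest" if needleleaf_frac > tree_frac / 2 else "temperate forest"
        if grass_frac >= tree_frac:
            return "tundra" if temp_c < 0 else "grassland"
        return "savanna/woodland"

    # Hypothetical cell: 70% broadleaf tree cover in a warm climate -> "tropical forest"
    print(mega_biome(tree_frac=0.7, grass_frac=0.2, needleleaf_frac=0.05, temp_c=24))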
Abstract:
This article develops a relational model of institutional work and complexity. This model advances current institutional debates on institutional complexity and institutional work in three ways. First, it provides a relational and dynamic perspective on institutional complexity by explaining how constellations of logics - and their degree of internal contradiction - are constructed rather than given. Second, it refines our current understanding of agency, intentionality and effort in institutional work by demonstrating how different dimensions of agency interact dynamically in the institutional work of reconstructing institutional complexity. Third, it situates institutional work in the everyday practice of individuals coping with the institutional complexities of their work. In doing so, it reconnects the construction of institutionally complex settings to the actions and interactions of the individuals who inhabit them. © The Author(s) 2013.
Abstract:
Intermediate-complexity general circulation models are a fundamental tool for investigating the role of internal and external variability within the general circulation of the atmosphere and ocean. The model used in this thesis is an intermediate-complexity atmospheric general circulation model (SPEEDY) coupled to a state-of-the-art modelling framework for the ocean (NEMO). We assess to what extent the model allows a realistic simulation of the most prominent natural mode of variability at interannual time scales: the El Niño-Southern Oscillation (ENSO). To a good approximation, the model reproduces the ENSO-induced Sea Surface Temperature (SST) pattern in the equatorial Pacific, despite a cold-tongue-like bias. The model underestimates (overestimates) the typical ENSO spatial variability during the winter (summer) seasons. The mid-latitude response to ENSO shows that the typical poleward stationary Rossby wave train is reasonably well represented. The spectral decomposition of ENSO features a spectrum that lacks variability at high frequencies and is overly periodic at interannual timescales. We then implemented an idealised transient mean-state change in the SPEEDY model: a warmer climate is simulated by altering the parametrized radiative fluxes so as to correspond to doubled carbon dioxide absorptivity. Results indicate that the globally averaged surface air temperature increases by 0.76 K. Regionally, the induced signal on the SST field features significant warming over the central-western Pacific and an El Niño-like warming in the subtropics. In general, the model features a weakening of the tropical Walker circulation and a poleward expansion of the local Hadley cell, a response also detected in a poleward rearrangement of the tropical convective rainfall pattern. The model setting implemented here provides valid theoretical support for future studies on climate sensitivity and forced modes of variability under mean-state changes.
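The spectral statements above refer to the frequency content of an ENSO index. Below is a minimal sketch of how such a spectrum is commonly estimated from a monthly Niño3.4 series; the data here are a random placeholder, whereas in practice the series would be the model's area-averaged equatorial Pacific SST anomaly:

    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(0)
    nino34 = rng.standard_normal(1200)   # placeholder: 100 years of monthly anomalies (deg C)

    # Power spectral density; fs=12 samples/year, so frequencies are in cycles/year.
    freq, psd = periodogram(nino34, fs=12.0, detrend="linear")

    # ENSO variability is usually concentrated in the 2-7 year band.
    band = (freq > 1 / 7) & (freq < 1 / 2)
    print("fraction of variance in the 2-7 yr band:", psd[band].sum() / psd[1:].sum())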
Abstract:
This thesis presents and validates an intermediate-complexity flood risk model for future climate scenarios. The model belongs to the category of tools that aim to meet the needs identified by the World Climate Research Programme (WCRP) for addressing the effects of climate change. The goal is to develop, following a "bottom-up" approach to regional climate risk, tools that can help decision makers implement adaptation to climate change. The model presented here is entirely based on open-source data provided by the Copernicus services. The contribution of this thesis concerns the development of a model, formulated by Ruggieri et al., to estimate the damage of river flood events at specific global warming levels (GWLs). The model was tested on three medium-sized river basins in Emilia-Romagna: the Panaro, the Reno and the Secchia. In this work, the model undergoes sensitivity tests with respect to a hypothesis stated in its formulation, and analyses are then carried out on the multi-model ensemble used for the projections. The model is then validated by comparing the damage estimated in the current climate for the three rivers with the observed damage, and by comparing simulated discharges with observed ones. Finally, the damage associated with flood events is estimated in three future climate scenarios characterized by GWLs of 1.5°C, 2.0°C and 3.0°C.
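As background on how flood damages are commonly aggregated across event magnitudes, a standard expected-annual-damage calculation (a sketch of common practice with placeholder numbers, not necessarily this thesis's exact procedure) looks like this:

    import numpy as np

    # Hypothetical damage estimates (million EUR) for floods of given return
    # periods (years); in practice these would come from the hazard-impact
    # chain described above, per river and per warming level.
    return_periods = np.array([10.0, 50.0, 100.0, 500.0])
    damages = np.array([5.0, 40.0, 90.0, 250.0])

    exceed_prob = 1.0 / return_periods        # annual exceedance probabilities
    # Trapezoidal integration of damage over exceedance probability.
    ead = np.sum(0.5 * (damages[:-1] + damages[1:])
                 * (exceed_prob[:-1] - exceed_prob[1:]))
    print(f"expected annual damage ~ {ead:.2f} MEUR/yr")

Repeating the calculation with the damage curve of each GWL scenario then shows how the expected annual damage shifts with warming.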
Abstract:
OBJECTIVES: The complexity and heterogeneity of human bone, as well as ethical issues, frequently hinder the development of clinical trials. The purpose of this in vitro study was to determine the modulus of elasticity of an isotropic experimental polyurethane model via tension tests, comparing the results with those reported in the literature for mandibular bone, in order to validate the use of such a model in lieu of mandibular bone in biomechanical studies. MATERIAL AND METHODS: Forty-five polyurethane test specimens were divided into 3 groups of 15 specimens each, according to the ratio (A/B) of polyurethane reagents (PU-1: 1/0.5, PU-2: 1/1, PU-3: 1/1.5). RESULTS: Tension tests were performed on each experimental group and the moduli of elasticity found were 192.98 MPa (SD=57.20) for PU-1, 347.90 MPa (SD=109.54) for PU-2 and 304.64 MPa (SD=25.48) for PU-3. CONCLUSION: The concentration of choice for building the experimental model was 1/1.
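For reference, the modulus of elasticity reported in these tests is the slope of the linear (elastic) portion of the stress-strain curve:

    E = \frac{\sigma}{\varepsilon} = \frac{F/A}{\Delta L / L_0}

where F is the applied tensile force, A the cross-sectional area of the specimen, \Delta L its elongation, and L_0 the initial gauge length.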
Abstract:
OBJECTIVES: The complexity and heterogeneity of human bone, as well as ethical issues, almost always hinder the performance of clinical trials. In vitro studies therefore become an important source of information for understanding biomechanical events in implant-supported prostheses, although study results cannot be considered reliable unless validation studies are conducted. The purpose of this work was to validate an artificial experimental model, based on its modulus of elasticity, to simulate the performance of human bone in vivo in biomechanical studies of implant-supported prostheses. MATERIAL AND METHODS: In this study, fast-curing polyurethane (F16 polyurethane, Axson) was used to build 40 specimens, divided into five groups with the following reagent ratios (part A/part B): Group A (0.5/1.0), Group B (0.8/1.0), Group C (1.0/1.0), Group D (1.2/1.0), and Group E (1.5/1.0). A universal testing machine (Kratos model K - 2000 MP) was used to measure modulus of elasticity values in compression. RESULTS: Mean modulus of elasticity values were: Group A - 389.72 MPa, Group B - 529.19 MPa, Group C - 571.11 MPa, Group D - 470.35 MPa, Group E - 437.36 MPa. CONCLUSION: The best mechanical characteristics and a modulus of elasticity comparable to that of human trabecular bone were obtained with an A/B ratio of 1:1.
Abstract:
In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numeric integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
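To make the construction concrete, the continuous Verhulst (logistic) equation and its Euler discretization take the standard form below; the identification of the carrying capacity with each user's SINR-dependent power target is shown only schematically and is a sketch, not necessarily the paper's exact recursion:

    \frac{dp}{dt} = r\, p \left(1 - \frac{p}{K}\right)
    \quad \xrightarrow{\ \text{Euler, step } \Delta t\ } \quad
    p[n+1] = p[n] + \Delta t\, r\, p[n] \left(1 - \frac{p[n]}{K}\right)

Since the SINR of user i scales linearly with its own power, taking K \approx p_i[n]\, \gamma_i^{*} / \gamma_i[n] (target over measured SINR) and \alpha = r \Delta t yields a distributed update that each user can compute from local measurements:

    p_i[n+1] = (1 + \alpha)\, p_i[n] - \alpha\, \frac{\gamma_i[n]}{\gamma_i^{*}}\, p_i[n]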
Abstract:
A new excitation model for the numerical solution of the electric field integral equation (EFIE) applied to arbitrarily shaped monopole antennas fed by coaxial lines is presented. This model yields a stable solution for the input impedance of such antennas with very low numerical complexity and without the convergence and high parasitic capacitance problems associated with the usual delta-gap excitation.
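For context, the conventional delta-gap excitation that the proposed model avoids impresses a uniform incident field across an infinitesimal feed gap (standard textbook form, not taken from this paper):

    E_z^{\mathrm{inc}} = \frac{V_0}{\Delta} \quad \text{(within the feed gap of width } \Delta \text{, zero elsewhere)}

Concentrating the source field in this way is what gives rise to the parasitic gap capacitance and the convergence difficulties mentioned in the abstract as the gap becomes small relative to the discretization.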
Abstract:
This essay is an attempt to give some mathematical form to the concept of biological complexity, exploring four attributes considered essential to characterize a complex system in a biological context: decomposition, heterogeneous assembly, self-organization, and adequacy. It is a theoretical and speculative approach that opens possibilities for further numerical and experimental work, illustrated by references to several studies that have applied the concepts presented here. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
This essay is an attempt to measure complexity in a three-trophic-level system using a convex function of the informational entropy. The complexity measure defined here is compatible with the fact that real complexity lies between ordered and disordered states. Applying this measure to data collected for two three-trophic-level systems yields some hints about their organization. (C) 2008 Elsevier B.V. All rights reserved.
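As an illustration of the general idea (not necessarily the essay's exact function), a measure built from the normalized entropy that vanishes at both extremes of order and disorder can be written as:

    C(\tilde{H}) = \tilde{H}\,(1 - \tilde{H}), \qquad \tilde{H} = \frac{-\sum_i p_i \ln p_i}{\ln N}

so that C = 0 for a fully ordered state (\tilde{H} = 0) and for a fully disordered one (\tilde{H} = 1), with C peaking at intermediate states, consistent with the requirement stated in the abstract.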
Abstract:
Overcommitment of development capacity and development resource deficiencies are important problems in new product development (NPD). Existing approaches to development resource planning have largely neglected the question of the magnitude of resources required for NPD. This research aims to fill that void by developing a simple higher-level aggregate model based on an intuitive idea: the number of new product families that a firm can effectively undertake is bounded by the complexity of its products or systems and by the total amount of resources allocated to NPD. This study examines three manufacturing companies to verify the proposed model. The empirical results confirm the study's initial hypothesis: the more complex the product family, the smaller the number of product families launched per unit of revenue. Several suggestions and implications for managing NPD resources are discussed, such as how this study's model can establish an upper limit for the capacity to develop and launch new product families.
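The upper-limit argument can be expressed as a simple aggregate constraint; the formalization below is illustrative, not the paper's exact model:

    N \le \frac{R_{\mathrm{NPD}}}{c\, \bar{k}}

where N is the number of product families under concurrent development, R_NPD the total resources allocated to NPD, \bar{k} an index of average product complexity, and c the resource load per family per unit of complexity. The empirical finding then corresponds to N per unit of revenue decreasing as \bar{k} grows.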