987 results for Conceptual modelling
Abstract:
The Driver Scheduling Problem (DSP) consists of selecting a set of duties for vehicle drivers (for example, drivers or pilots of buses, trains, planes or boats) for the transportation of passengers or goods. This is a complex problem because it involves several constraints related to labour and company rules and can also present different evaluation criteria and objectives. Developing an adequate model that represents the real problem as closely as possible is an important research area. The main objective of this research work is to present new mathematical models for the DSP that represent the full complexity of the driver scheduling problem, and also to demonstrate that the solutions of these models can be easily implemented in real situations. This issue has been recognized by several authors as an important problem in Public Transportation. The most well-known and general formulation for the DSP is the Set Partitioning/Set Covering model (SPP/SCP). However, to a large extent these models simplify some of the specific business aspects and issues of real problems. This makes it difficult to use them as automatic planning systems, because the schedules obtained must be modified manually before they can be implemented in real situations. Based on extensive passenger transportation experience with bus companies in Portugal, we propose new alternative models to formulate the DSP. These models are also based on Set Partitioning/Covering models; however, they take into account the bus operator's issues and the perspective, opinions and environment of the user. We follow the steps of the Operations Research methodology, which consist of: Identify the Problem; Understand the System; Formulate a Mathematical Model; Verify the Model; Select the Best Alternative; Present the Results of the Analysis; and Implement and Evaluate. All the processes are carried out with the close participation and involvement of the final users from different transportation companies.
The planners' opinions and main criticisms are used to improve the proposed model in a continuous enrichment process. The final objective is to have a model that can be incorporated into an information system and used as an automatic tool to produce driver schedules. Therefore, the criterion for evaluating the models is their capacity to generate real and useful schedules that can be implemented without many manual adjustments or modifications. We have considered the following as measures of the quality of a model: simplicity, solution quality and applicability. We tested the alternative models on a set of real data obtained from several different transportation companies and analyzed the optimal schedules obtained with respect to the applicability of the solution to the real situation. To do this, the schedules were analyzed by the planners to determine their quality and applicability. The main result of this work is the proposal of new mathematical models for the DSP that better represent the realities of passenger transportation operators and lead to better schedules that can be implemented directly in real situations.
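The Set Partitioning formulation underlying the models above can be illustrated with a minimal sketch. All trip and duty data below are invented for illustration, and real instances are solved with ILP solvers and column generation rather than exhaustive search:

```python
from itertools import combinations

def solve_spp(trips, duties):
    """Exact set partitioning by exhaustive search (viable only for tiny instances).
    trips: set of trip ids, each to be covered exactly once.
    duties: list of (cost, frozenset_of_trips) candidate duties.
    Returns (best_cost, chosen_duties), or (None, None) if infeasible."""
    best = (None, None)
    for r in range(1, len(duties) + 1):
        for combo in combinations(duties, r):
            covered = [t for _, d in combo for t in d]
            # A partition covers every trip exactly once: no omissions, no overlaps.
            if len(covered) == len(set(covered)) and set(covered) == trips:
                cost = sum(c for c, _ in combo)
                if best[0] is None or cost < best[0]:
                    best = (cost, combo)
    return best

# Hypothetical instance: 4 trips, 5 candidate duties with costs.
trips = {1, 2, 3, 4}
duties = [(3, frozenset({1, 2})), (2, frozenset({3, 4})),
          (4, frozenset({1, 3})), (3, frozenset({2, 4})),
          (6, frozenset({1, 2, 3, 4}))]
cost, chosen = solve_spp(trips, duties)
```

The extended models proposed in the thesis add labour-rule and operator-specific constraints on top of this basic structure; those are not reproduced here.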
Abstract:
This paper evaluates new evidence on price setting practices and inflation persistence in the euro area with respect to its implications for macro modelling. It argues that several of the most commonly used assumptions in micro-founded macro models are seriously challenged by the new findings.
3D seismic facies characterization and geological patterns recognition (Australian North West Shelf)
Abstract:
EXECUTIVE SUMMARY This PhD research, funded by the Swiss National Science Foundation, is principally devoted to enhancing the recognition, visualisation and characterization of geobodies through innovative 3D seismic approaches. A series of case studies from the Australian North West Shelf ensures the development of reproducible integrated 3D workflows and gives new insight into local and regional stratigraphic as well as structural issues. This project was initiated in 2000 at the Geology and Palaeontology Institute of the University of Lausanne (Switzerland). Several collaborations ensured the improvement of the technical approaches as well as the assessment of the geological models. - Investigations into the Timor Sea structural style were carried out at the Tectonics Special Research Centre of the University of Western Australia and in collaboration with Woodside Energy in Perth. - The seismic analysis and attribute classification approach was initiated with Schlumberger Oilfield Australia in Perth; assessments and enhancements of the integrated seismic approaches benefited from collaborations with scientists from Schlumberger Stavanger Research (Norway). Adapting and refining "linear" exploration techniques, a conceptual "helical" 3D seismic approach has been developed. In order to investigate specific geological issues, this approach, integrating seismic attributes and visualisation tools, has been refined and adjusted, leading to the development of two specific workflows: - A stratigraphic workflow focused on the recognition of geobodies and the characterization of depositional systems. Additionally, it can support the modelling of subsidence and, incidentally, help constrain the hydrocarbon maturity of a given area. - A structural workflow used to quickly and accurately define major and secondary fault systems.
The integration of the 3D structural interpretation results enables the analysis of the kinematics of the fault networks, which can affect hydrocarbon trapping mechanisms. The application of these integrated workflows brings new insight into two complex settings on the Australian North West Shelf and yields striking stratigraphic and structural outcomes. The stratigraphic workflow provides a 3D characterization of the Late Palaeozoic glacial depositional system on the Mermaid Nose (Dampier Sub-basin, Northern Carnarvon Basin), which presents similarities with the glacial facies along the Neotethys margin up to Oman (chapter 3.1). A subsidence model reveals the Phanerozoic geodynamic evolution of this area (chapter 3.2) and emphasizes two distinct modes of regional extension for the Palaeozoic (Neotethys opening) and the Mesozoic (abyssal plains opening). The structural workflow is used to define the structural evolution of the Laminaria High area (Bonaparte Basin). Following a regional structural characterization of the Timor Sea (chapter 4.1), a thorough analysis of the Mesozoic fault architecture reveals a local rotation of the stress field and the development of reverse structures (flower structures) in an extensional setting, which form potential hydrocarbon traps (chapter 4.2). The definition of the complex Neogene structural architecture, combined with the fault kinematic analysis and a plate flexure model (chapter 4.3), suggests that the Miocene to Pleistocene reactivation phases recorded at the Laminaria High most probably result from the oblique normal reactivation of the underlying Mesozoic fault planes. This episode is associated with the deformation of the subducting Australian plate. Based on these results, three papers were published in international journals and two additional publications will be submitted. Additionally, this research has led to several communications at international conferences.
Although the different workflows presented in this research have been primarily developed and used for the analysis of specific stratigraphic and structural geobodies on the Australian North West Shelf, similar integrated 3D seismic approaches will have applications in hydrocarbon exploration and production, for instance improving the recognition of potential source rocks, secondary migration pathways, additional traps or reservoir breaching mechanisms. The new elements brought by this research further highlight that 3D seismic data contain a tremendous amount of hidden geological information waiting to be revealed, which will undoubtedly bring new insight into the depositional systems, structural evolution and geohistory both of areas reputed to be well explored and constrained and of others yet to be constrained. The further development of 3D texture attributes highlighting specific features of the seismic signal, the integration of quantitative analysis of stratigraphic and structural processes, the automation of the interpretation workflow, and the formal definition of the "seismo-morphologic" characteristics of a wide range of geobodies from various environments would represent challenging continuations of the present research. The 21st century will most probably represent a transition period between fossil and alternative energies. The next generation of seismic interpreters prospecting for hydrocarbons will undoubtedly face new challenges, mostly due to the shortage of obvious and easy targets. They will probably have to keep integrating techniques and geological processes in order to further capitalise on the seismic data when defining new potential. Imagination and creativity will most certainly be among the most important qualities required of such geoscientists.
Abstract:
Youth is one of the phases in the life-cycle when some of the most decisive life transitions take place. Entering the labour market or leaving the parental home are events with important consequences for the economic well-being of young adults. In this paper, the interrelationship between employment, residential emancipation and poverty dynamics is studied for eight European countries by means of an econometric model with feedback effects. Results show that youth poverty genuine state dependence is positive and highly significant. Evidence proves there is a strong causal effect between poverty and leaving home in Scandinavian countries; however, time in economic hardship does not last long. In Southern Europe, instead, youth tend to leave their parental home much later in order to avoid falling into a poverty state that is more persistent. Past poverty has negative consequences on the likelihood of employment.
Abstract:
Despite the importance of supplier inducement and brand loyalty in the drug purchasing process, little empirical evidence is to be found with regard to the influence that these factors exert on patients' decisions. Under the new scenario of easier access to information, patients are becoming more demanding and even go as far as questioning their physicians' prescriptions. Furthermore, new regulation also encourages patients to adopt an active role in the decision between brand-name and generic drugs. Using a stated preference model based on a choice survey, I have found evidence of how significant physicians' prescriptions and pharmacists' recommendations become throughout the drug purchase process and to what extent brand loyalty influences the final decision. As far as we are aware, this paper is the first to explicitly take consumers' preferences into account rather than focusing on the behavior of health professionals.
Abstract:
Geobiota are defined by taxic assemblages (i.e., biota) and their defining abiotic breaks, which are mapped in cross-section to reveal past and future biotic boundaries. We term this conceptual approach Temporal Geobiotic Mapping (TGM) and offer it as a conceptual approach for biogeography. TGM is based on geological cross-sectioning, which creates maps based on the distribution of biota and known abiotic factors that drive their distribution, such as climate, topography, soil chemistry and underlying geology. However, the availability of abiotic data is limited for many areas. Unlike other approaches, TGM can be used when there is minimal data available. In order to demonstrate TGM, we use the well-known area in the Blue Mountains, New South Wales (NSW), south-eastern Australia and show how surface processes such as weathering and erosion affect the future distribution of a Moist Basalt Forest taxic assemblage. Biotic areas are best represented visually as maps, which can show transgressions and regressions of biota and abiota over time. Using such maps, a biogeographer can directly compare animal and plant distributions with features in the abiotic environment and may identify significant geographical barriers or pathways that explain biotic distributions.
Abstract:
Context There are no evidence syntheses available to guide clinicians on when to titrate antihypertensive medication after initiation. Objective To model the blood pressure (BP) response after initiating antihypertensive medication. Data sources Electronic databases, including Medline, Embase and the Cochrane Register, and reference lists up to December 2009. Study selection Trials that initiated antihypertensive medication as single therapy in hypertensive patients who were either drug naive or had a placebo washout from previous drugs. Data extraction Office BP measurements at a minimum of two-weekly intervals for a minimum of 4 weeks. An asymptotic approach model of BP response was assumed and non-linear mixed effects modelling was used to calculate model parameters. Results Eighteen trials that recruited 4168 patients met the inclusion criteria. The time to reach 50% of the maximum estimated BP-lowering effect was approximately 1 week (systolic 0.91 weeks, 95% CI 0.74 to 1.10; diastolic 0.95, 0.75 to 1.15). Models incorporating drug class as a source of variability did not improve the fit of the data. Incorporating the presence of a titration schedule improved model fit for both systolic and diastolic pressure. Titration increased both the predicted maximum effect and the time taken to reach 50% of the maximum (systolic 1.2 vs 0.7 weeks; diastolic 1.4 vs 0.7 weeks). Conclusions Estimates of the maximum efficacy of antihypertensive agents can be made early after starting therapy. This knowledge will guide clinicians in deciding when a newly started antihypertensive agent is likely or unlikely to be effective at controlling BP.
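The asymptotic-approach response described above can be sketched numerically. The exponential functional form and the 10 mmHg maximum effect below are illustrative assumptions; only the pooled systolic half-time estimate (t50 ≈ 0.91 weeks) is taken from the abstract:

```python
import math

def bp_drop(t_weeks, e_max, t50):
    """Asymptotic-approach model of BP response: the effect rises toward
    e_max (mmHg), reaching 50% of e_max at t50 weeks.
    An exponential approach e_max * (1 - exp(-k*t)) is assumed here,
    so the rate constant is k = ln(2) / t50."""
    k = math.log(2) / t50
    return e_max * (1 - math.exp(-k * t_weeks))

# With the pooled systolic estimate t50 = 0.91 weeks and a
# hypothetical 10 mmHg maximum effect:
half = bp_drop(0.91, e_max=10.0, t50=0.91)  # exactly half the maximum effect
week4 = bp_drop(4.0, e_max=10.0, t50=0.91)  # close to the asymptote by week 4
```

A clinician-facing implication of this shape is that most of the attainable effect is visible within a few weeks, which is the basis for the early-titration conclusion.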
Abstract:
Designing an efficient sampling strategy is of crucial importance for habitat suitability modelling. This paper compares four such strategies, namely 'random', 'regular', 'proportional-stratified' and 'equal-stratified', to investigate (1) how they affect prediction accuracy and (2) how sensitive they are to sample size. In order to compare them, a virtual species approach (Ecol. Model. 145 (2001) 111) in a real landscape, based on reliable data, was chosen. The distribution of the virtual species was sampled 300 times using each of the four strategies at four sample sizes. The sampled data were then fed into a GLM to make two types of prediction: (1) habitat suitability and (2) presence/absence. Comparing the predictions to the known distribution of the virtual species allows model accuracy to be assessed. Habitat suitability predictions were assessed by Pearson's correlation coefficient and presence/absence predictions by Cohen's K agreement coefficient. The results show the 'regular' and 'equal-stratified' sampling strategies to be the most accurate and most robust. We propose the following characteristics to improve sample design: (1) increase sample size, (2) prefer systematic to random sampling and (3) include environmental information in the design.
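The four designs compared above can be sketched on a toy grid. This is an illustrative implementation only; the original study sampled a virtual species in a real landscape, and the stratification variable here is invented:

```python
import random

def design(strategy, n, cells, strata=None):
    """Pick n sample cells from `cells` under one of the four designs
    compared in the paper. `strata` maps each cell to a stratum label
    (e.g. an environmental class) for the stratified designs."""
    if strategy == "random":
        return random.sample(cells, n)
    if strategy == "regular":
        # Systematic sampling: one cell every len(cells)/n positions.
        step = len(cells) / n
        return [cells[int(i * step)] for i in range(n)]
    groups = {}
    for c in cells:
        groups.setdefault(strata[c], []).append(c)
    if strategy == "equal-stratified":
        # Same number of cells drawn from every stratum.
        k = n // len(groups)
        return [c for g in groups.values() for c in random.sample(g, k)]
    if strategy == "proportional-stratified":
        # Cells per stratum proportional to stratum size.
        return [c for g in groups.values()
                for c in random.sample(g, round(n * len(g) / len(cells)))]
    raise ValueError(strategy)

cells = list(range(100))                 # toy 100-cell landscape
strata = {c: c // 50 for c in cells}     # two equal hypothetical strata
pts = design("equal-stratified", 10, cells, strata)
```

In the paper's setup, samples like `pts` would then be used to fit a GLM and the predictions compared against the known virtual-species distribution.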
Abstract:
Doctoral dissertation, University of Jyväskylä
Abstract:
Although functional neuroimaging studies have supported the distinction between explicit and implicit forms of memory, few have matched explicit and implicit tests closely, and most of these tested perceptual rather than conceptual implicit memory. We compared event-related fMRI responses during an intentional test, in which a group of participants used a cue word to recall its associate from a prior study phase, with those in an incidental test, in which a different group of participants used the same cue to produce the first associate that came to mind. Both semantic relative to phonemic processing at study, and emotional relative to neutral word pairs, increased target completions in the intentional test, but not in the incidental test, suggesting that behavioral performance in the incidental test was not contaminated by voluntary explicit retrieval. We isolated the neural correlates of successful retrieval by contrasting fMRI responses to studied versus unstudied cues for which the equivalent "target" associate was produced. By comparing the difference in this repetition-related contrast across the intentional and incidental tests, we could identify the correlates of voluntary explicit retrieval. This contrast revealed increased bilateral hippocampal responses in the intentional test, but decreased hippocampal responses in the incidental test. A similar pattern in the bilateral amygdala was further modulated by the emotionality of the word pairs, although surprisingly only in the incidental test. Parietal regions, however, showed increased repetition-related responses in both tests. These results suggest that the neural correlates of successful voluntary explicit memory differ in directionality, even if not in location, from the neural correlates of successful involuntary implicit (or explicit) memory, even when the incidental test taps conceptual processes.
Abstract:
During the last decades, the notion of primary intersubjectivity has gained acceptance among developmentalists and clinicians. But recent findings on the triangular competence of the very young infant, that is, her capacity to communicate with two partners at a time, pose a new challenge to our models. This discovery raises the question of a collective form of intersubjectivity. Findings on the triangular competence of 3- to 4-month-old infants interacting with father and mother in different contexts of the Lausanne trilogue play situation are reviewed and illustrated, with a view to examining whether this competence is based on a dyadic or a triangular program and whether the conditions for a threesome form of primary intersubjectivity are fulfilled. The discussion focuses on the revisions of the theory of intersubjectivity, of developmental theory, and of clinical practice that these findings call for, pointing toward a three-person psychology too.
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected by two operational C-band radars in the northern Italian region of Emilia-Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. Thereafter, a comparison between different techniques and datasets for retrieving the vertical profile of the refractive index gradient in the boundary layer is also presented; in particular, their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model and the influence of the propagation conditions on the beam trajectory and shape is examined. High-resolution radiosounding data are identified as the best available dataset to reproduce the local propagation conditions accurately, while lower-resolution standard TEMP data suffer from interpolation degradation and Numerical Weather Prediction model data (Lokal Modell) are able to retrieve a tendency to superrefraction but not to detect ducting conditions. Observing the ray tracing of the centre, lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
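The refractive-index diagnostics discussed above rest on the radio refractivity and its vertical gradient. The sketch below uses the standard Smith-Weintraub formula and the conventional gradient classification thresholds, which are general radio-meteorology conventions rather than details taken from this paper:

```python
def refractivity(p_hpa, t_kelvin, e_hpa):
    """Radio refractivity N (N-units), Smith-Weintraub form:
    a dry term 77.6*P/T plus a wet term 3.73e5*e/T^2,
    with pressure P and water vapour pressure e in hPa, T in kelvin."""
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin ** 2

def propagation_class(dn_dh):
    """Classify propagation from the vertical refractivity gradient
    dN/dh (N-units per km), using the usual threshold convention."""
    if dn_dh <= -157:
        return "ducting"          # modified refractivity gradient dM/dh <= 0
    if dn_dh <= -79:
        return "superrefraction"
    if dn_dh <= 0:
        return "normal"
    return "subrefraction"

# Typical mid-latitude surface values (illustrative): 1013 hPa, 15 C, e = 10 hPa.
n_surface = refractivity(1013.0, 288.0, 10.0)
```

A steep negative gradient near the surface, as in the humid summer mornings over the Po Valley described above, pushes dN/dh below the ducting threshold and traps the radar beam.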
Abstract:
Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere but, although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
Abstract:
High N concentrations in biosolids are one of the strongest reasons for their agricultural use. However, it is essential to understand the fate of N in soils treated with biosolids, both for plant nutrition and for managing the environmental risk of NO3⁻-N leaching. This work aimed at evaluating the risk of NO3⁻-N leaching from a Spodosol and an Oxisol, each treated with 0.5-8.0 dry Mg ha⁻¹ of fresh tertiary sewage sludge, composted biosolids, limed biosolids, heat-dried biosolids or solar-irradiated biosolids. Results indicated that under similar application rates NO3⁻-N accumulated up to three times more in the 20 cm topsoil of the Oxisol than in the Spodosol. However, the higher water content held at field capacity in the Oxisol compensated for the greater nitrate concentrations. A 20% NO3⁻-N loss from the root zone in the amended Oxisol could be expected. Depending on the biosolids type, 42 to 76% of the NO3⁻-N accumulated in the Spodosol could be expected to leach from the amended 20 cm topsoil. The NO3⁻-N expected to leach from the Spodosol ranged from 0.8 (composted sludge) to 3.5 times (limed sludge) the amounts leaching from the Oxisol treated alike. Nevertheless, the risk of NO3⁻-N groundwater contamination as a result of a single biosolids land application at 0.5-8.0 dry Mg ha⁻¹ could be considered low.