928 results for Iterative Implementation Model
Abstract:
This paper provides a new reading of a classical economic relation: the short-run Phillips curve. Our point is that, when dealing with inflation and unemployment, policy-making can be understood as a multicriteria decision-making problem. Hence, we use multiobjective programming in connection with a computable general equilibrium (CGE) model to determine the combinations of policy instruments that yield efficient combinations of inflation and unemployment. This approach results in an alternative version of the Phillips curve, which we label the efficient Phillips curve. Our aim is to present an application of CGE models to a new area of research that can be especially useful when addressing policy exercises with real data. We apply our methodological proposal to a particular regional economy, Andalusia, in southern Spain. This tool can offer some keys for policy advice and policy implementation in the fight against unemployment and inflation.
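The multiobjective idea behind the efficient Phillips curve can be illustrated with a toy Pareto filter (a sketch only; the paper applies multiobjective programming to a full CGE model, and the outcome numbers below are hypothetical):

```python
# Toy illustration (not the paper's CGE model): each policy-instrument mix
# maps to an (inflation, unemployment) outcome; the efficient Phillips curve
# keeps only the outcomes not dominated in both criteria.
def pareto_front(points):
    """Return the points for which no other point is at least as good
    in both criteria (lower inflation and lower unemployment)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (inflation %, unemployment %) outcomes of policy mixes.
outcomes = [(2.0, 9.0), (3.0, 7.0), (4.0, 6.5), (2.5, 9.5), (3.5, 6.8)]
efficient = pareto_front(outcomes)  # (2.5, 9.5) is dominated by (2.0, 9.0)
```

The surviving points trace the trade-off frontier between the two objectives, which is the shape the paper reads as an efficient Phillips curve.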
Abstract:
In this paper we present the development and implementation of a content analysis model for observing aspects of the social mission of public libraries on Facebook pages and websites. The model is unique and was developed from the literature. Four categories of analysis were designed: Generate social capital and social cohesion; Consolidate democracy and citizenship; Social and digital inclusion; and Fighting illiteracies. The model enabled the collection and analysis of data in a case study of 99 Portuguese public libraries with a Facebook page. With this content analysis model we observed the facets of the social mission and read the actions with social facets on the Facebook pages and websites of the public libraries. At the end we discuss in parallel the results of observing the libraries' Facebook pages and their websites. From the descriptions of the social mission actions, the most immediate general conclusion is that the 99 public libraries rarely publish actions of a social character on Facebook or on their websites, and the results are not very satisfactory. The Portuguese public libraries substantially emphasise actions in the category Generate social capital and social cohesion.
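The category tally underlying such a content analysis can be sketched as follows (a minimal illustration; only the four category names come from the paper, and the coded posts are hypothetical):

```python
from collections import Counter

# The paper's four analysis categories for social-mission actions.
CATEGORIES = [
    "Generate social capital and social cohesion",
    "Consolidate democracy and citizenship",
    "Social and digital inclusion",
    "Fighting illiteracies",
]

def tally(coded_posts):
    """Count how many manually coded posts fall into each category,
    reporting zero for categories with no posts."""
    counts = Counter(coded_posts)
    return {c: counts.get(c, 0) for c in CATEGORIES}

# Hypothetical codings of three library Facebook posts.
posts = [
    "Generate social capital and social cohesion",
    "Fighting illiteracies",
    "Generate social capital and social cohesion",
]
result = tally(posts)
```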
Abstract:
The Complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar and a dark matter candidate; and the broken phase, with all three neutral scalars mixing. In the latter phase, decays of a Higgs boson into a pair of different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with a focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible, we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only way to distinguish the two models at the LHC run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at the LHC run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at the LHC run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and the dark matter constraints are fulfilled. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.
Abstract:
This paper discusses the results and propositions of organizational knowledge management research conducted in the period 2001-2007. This longitudinal study had the goal of investigating and analyzing "Knowledge Management" (KM) processes effectively implemented in world-class organizations. The main objective was to investigate and analyze the conceptions, motivations, practices, metrics and results of KM processes implemented in different industries. The first set of studies involved 20 world cases reported in the literature and served as the basis for a theoretical framework entitled "KM Integrative Conceptual Mapping Proposition". This theoretical proposal was then tested in a qualitative study of three large organizations in Brazil. The results of the qualitative study validated the mapping proposition and left open questions for new research concerning the implementation of a knowledge-based organizational model strategy.
Abstract:
Part 11: Reference and Conceptual Models
Abstract:
In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and an Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is better suited for modelling human reactive behaviour in the retail sector. In order to study the output accuracy of both models, we carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment used a large UK department store as a case study, in which we had to determine an efficient implementation of management policy in the store's fitting room using DES and ABS. Overall, we found that both simulation models were a good representation of the real system when modelling human reactive behaviour.
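A validation experiment of this kind typically scores simulated output against the real system with an error measure; a minimal sketch (the choice of metric and all numbers are illustrative assumptions, not taken from the paper):

```python
# Mean absolute percentage error (MAPE), a common accuracy measure when
# validating simulation output against observations from a real system.
def mape(real, simulated):
    """Average absolute deviation of simulated from real values,
    expressed as a percentage of the real values."""
    return 100.0 * sum(abs(r - s) / r for r, s in zip(real, simulated)) / len(real)

# Hypothetical hourly fitting-room throughput: observed vs. one model's output.
real = [100.0, 120.0, 80.0]
des_output = [95.0, 118.0, 84.0]
err = mape(real, des_output)  # a low percentage suggests a good representation
```

Computing the same metric for both the DES and the ABS output puts the two techniques on a common footing for comparison.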
Abstract:
Knowledge of the efficacy of an intervention for disease control on an individual farm is essential to make good decisions on preventive healthcare, but the uncertainty in outcome associated with undertaking a specific control strategy has rarely been considered in veterinary medicine. The purpose of this research was to explore the uncertainty in the change in disease incidence and financial benefit that could occur on different farms when two effective farm management interventions are undertaken. Bovine mastitis was used as an example disease, and the research was conducted using data from an intervention study as prior information within an integrated Bayesian simulation model. Predictions were made of the reduction in clinical mastitis within 30 days of calving on 52 farms, attributable to the application of two herd interventions previously reported as effective: rotation of dry cow pasture and differential dry cow therapy. Results indicated that there were important degrees of uncertainty in the predicted reduction in clinical mastitis for individual farms when either intervention was undertaken; the magnitudes of the 95% credible intervals for the reduction in clinical mastitis incidence were substantial and of clinical relevance. The large uncertainty associated with the predicted reduction in clinical mastitis attributable to the interventions resulted in important variability in possible financial outcomes for each farm. The uncertainty in outcome associated with farm control measures illustrates the difficulty facing a veterinary clinician when making an on-farm decision and highlights the importance of iterative herd health procedures (continual evaluation, reassessment and adjusted interventions) to optimise health in an individual herd.
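The idea of predicting a reduction together with a credible interval can be sketched with a simple Monte Carlo simulation (the distributions and parameter values below are illustrative assumptions, not the study's fitted posteriors):

```python
import random

random.seed(1)

def predict_reduction(n=10_000):
    """Sample an uncertain baseline incidence and an uncertain intervention
    effect, and return a 95% credible interval for the absolute reduction."""
    reductions = []
    for _ in range(n):
        baseline = random.gauss(0.20, 0.03)  # cases per cow-month (assumed)
        effect = random.gauss(0.35, 0.10)    # proportional reduction (assumed)
        reductions.append(baseline * effect)
    reductions.sort()
    return reductions[int(0.025 * n)], reductions[int(0.975 * n)]

lo, hi = predict_reduction()
# A wide (lo, hi) interval is exactly the kind of clinically relevant
# uncertainty the study reports for individual farms.
```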
Abstract:
Part 6: Engineering and Implementation of Collaborative Networks
Abstract:
The majority of organizations store their historical business information in data warehouses, which are queried to make strategic decisions using online analytical processing (OLAP) tools. This information has to be properly secured against unauthorized access; nevertheless, a great number of legacy OLAP applications have been developed without considering security aspects, or security has been incorporated only after the system was implemented. This work defines a reverse engineering process that allows us to obtain the conceptual model corresponding to a legacy OLAP application, and also analyses and represents the security aspects that could have been established. This process has been aligned with a model-driven architecture for developing secure OLAP applications by defining the transformations needed to apply it automatically. Once the conceptual model has been extracted, it can be easily modified and improved with security measures, and automatically transformed to generate the new implementation.
Mining and Verification of Temporal Events with Applications in Computer Micro-Architecture Research
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. 
This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
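The invariant-checking idea in the first part of the framework can be sketched as follows (the invariant, the event names and the API are hypothetical illustrations, not the actual FOLCSL syntax or its generated verifier):

```python
# Sketch: each invariant is a predicate over the simulator's event trace;
# the checker reports which invariants the trace violates.
def check_invariants(trace, invariants):
    """Return the names of the invariants that do not hold on the trace."""
    return [name for name, pred in invariants.items() if not pred(trace)]

# Example micro-architecture invariant (hypothetical): every 'issue' event
# must be preceded by a 'fetch' event for the same instruction id.
def issue_after_fetch(trace):
    fetched = set()
    for event, iid in trace:
        if event == "fetch":
            fetched.add(iid)
        elif event == "issue" and iid not in fetched:
            return False
    return True

# Instruction 2 is issued without ever being fetched, so the check fails.
trace = [("fetch", 1), ("issue", 1), ("issue", 2)]
violated = check_invariants(trace, {"issue_after_fetch": issue_after_fetch})
```

In the framework described above, such a checker would be synthesized automatically from a first-order logic specification rather than written by hand.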
Abstract:
Retaining walls are important assets in the transportation infrastructure, and assessing their condition is important to prolong their performance and ultimately their design life. Retaining walls are often overlooked, and only a few transportation asset management programs consider them in their inventories. Because these programs are few, the techniques used to assess wall condition rely on qualitative assessment rather than a quantitative approach. The work presented in this thesis focuses on using photogrammetry to quantitatively assess the condition of retaining walls. Multitemporal photogrammetry is used to develop 3D models of the retaining walls, from which offset displacements are measured to assess their condition. This study presents a case study from a site along the M-10 highway in Detroit, MI, where several sections of retaining walls have experienced horizontal displacement towards the highway. The results are validated by comparison with field observations and measurements. The limitations of photogrammetry were also studied using a small-scale model in the laboratory. The analysis found that the accuracy of the offset displacement measurements depends on the distance between the retaining wall and the sensor, the location of the reference points in 3D space, and the focal length of the camera lenses. These parameters were not ideal for the case study at the M-10 highway site, but the results provided consistent trends in the movement of the retaining wall that could not be validated from offset measurements. The findings of this study confirm that photogrammetry shows promise in generating 3D models to provide a quantitative condition assessment for retaining walls within its limitations.
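The offset-displacement measurement between two photogrammetric epochs reduces to comparing corresponding points on the two 3D models; a minimal sketch with hypothetical coordinates:

```python
# Sketch of a multitemporal offset measurement: corresponding points from
# two photogrammetric 3D models of the same wall, with displacement taken
# as the mean offset along one axis (here x, assumed to point toward the
# highway). Coordinates are hypothetical.
def mean_offset(epoch1, epoch2, axis=0):
    """Average per-point coordinate change along the chosen axis."""
    diffs = [b[axis] - a[axis] for a, b in zip(epoch1, epoch2)]
    return sum(diffs) / len(diffs)

epoch1 = [(0.00, 1.0, 5.0), (0.01, 2.0, 5.0), (0.02, 3.0, 5.0)]  # metres
epoch2 = [(0.05, 1.0, 5.0), (0.07, 2.0, 5.0), (0.06, 3.0, 5.0)]
dx = mean_offset(epoch1, epoch2)  # ~0.05 m of movement toward the highway
```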
Abstract:
Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate a lane or lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common. Managed lane demand is usually estimated at the assignment step; therefore, the key to reliably estimating the demand is the use of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning at near-capacity, so capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict the managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support an effective use of DTA to model managed lane operations. Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.
With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in the different stages of modeling and calibrating managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as a proper definition of performance measures, results in a calibrated and stable model that closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.
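The iterative calibration loop can be sketched in miniature (the stand-in simulator and the damped adjustment rule are illustrative assumptions, not the study's DTA framework):

```python
# Toy calibration loop: adjust a demand scaling factor until the simulated
# traffic count matches the observed count within a tolerance.
def simulate(demand_factor):
    """Hypothetical stand-in for a full traffic assignment run:
    returns a simulated link count for a given demand scaling."""
    return 1000.0 * demand_factor

def calibrate(observed, tol=1.0, max_iter=50):
    """Iteratively nudge the demand factor toward the observed count
    using a damped proportional adjustment."""
    factor = 1.0
    for _ in range(max_iter):
        gap = observed - simulate(factor)
        if abs(gap) < tol:
            break
        factor += 0.5 * gap / 1000.0  # damping avoids overshoot
    return factor

factor = calibrate(observed=1234.0)
```

In the real framework the "simulator" is the DTA run and several interacting components (demand, network, route choice) are adjusted in turn, but the converge-by-iteration structure is the same.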
Abstract:
In this thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The work involved the development and implementation of numerical algorithms, data structures, and software. Numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated. The convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, as both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows for exact satisfaction of boundary conditions, making it possible to exactly capture the effects of the fluid field on the solid structure.
Abstract:
Efficient numerical models facilitate the study and design of solid oxide fuel cells (SOFCs), stacks, and systems. Whilst accuracy and reliability of the computed results are usually sought by researchers, the corresponding modelling complexities can create practical difficulties regarding implementation flexibility and computational cost. The main objective of this article is to adapt a simple but viable numerical tool for evaluation of our experimental rig. Accordingly, a model of a multi-layer SOFC surrounded by a constant-temperature furnace is presented, trained and validated against experimental data. The model consists of a four-layer structure including the stand, two interconnects, and the PEN (Positive electrode-Electrolyte-Negative electrode), each approximated by a lumped parameter model. The heating process through the surrounding chamber is also considered. We used a set of V-I characteristic data for parameter adjustment, followed by model verification against two independent sets of data. The model results show good agreement with practical data, offering a significant improvement over reduced models in which the impact of external heat loss is neglected. Furthermore, thermal analysis for adiabatic and non-adiabatic processes is carried out to capture the thermal behaviour of a single cell, followed by a polarisation loss assessment. Finally, model-based design of experiment is demonstrated for a case study.
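The parameter-adjustment step against V-I data can be sketched with a least-squares fit (the linear loss model V = OCV - R*I and the data below are illustrative assumptions, not the article's lumped thermal model):

```python
# Sketch: fit the linear cell model V = OCV - R*I to V-I characteristic
# data by ordinary least squares, recovering the open-circuit voltage (OCV)
# and an effective area-specific resistance R.
def fit_vi(currents, voltages):
    n = len(currents)
    mean_i = sum(currents) / n
    mean_v = sum(voltages) / n
    cov = sum((i - mean_i) * (v - mean_v)
              for i, v in zip(currents, voltages))
    var = sum((i - mean_i) ** 2 for i in currents)
    slope = cov / var          # equals -R in this model
    ocv = mean_v - slope * mean_i
    return ocv, -slope

# Hypothetical V-I characteristic data (A, V).
I = [0.0, 0.5, 1.0, 1.5]
V = [1.05, 0.95, 0.85, 0.75]
ocv, r = fit_vi(I, V)  # recovers OCV = 1.05 V, R = 0.2 ohm for this data
```

The article's full model adjusts thermal and electrochemical parameters jointly across the four layers, but this illustrates the train-then-verify workflow against independent data sets.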