940 results for Multi-Choice mixed integer goal programming
Abstract:
A mosaic of two WorldView-2 high-resolution multispectral images (acquisition dates: October 2010 and April 2012), in conjunction with field survey data, was used to create a habitat map of the Danajon Bank, Philippines (10°15'0'' N, 124°08'0'' E) using an object-based approach. To create the habitat map, we conducted benthic cover (seafloor) field surveys using two methods. First, we undertook georeferenced point intercept transects (English et al., 1997): for ten sites we recorded habitat cover types at 1 m intervals on 10 m long transects (n = 2,070 points). Second, we conducted georeferenced spot check surveys, placing a viewing bucket in the water to estimate the percent cover of benthic cover types (n = 2,357 points). Survey locations were chosen to cover a diverse and representative subset of habitats found in the Danajon Bank. The combination of methods was a compromise between the higher accuracy of point intercept transects and the larger sample area achievable through spot check surveys (Roelfsema and Phinn, 2008, doi:10.1117/12.804806). Object-based image analysis, using the field data as calibration data, was used to classify the image mosaic at each of the reef, geomorphic and benthic community levels. The benthic community level segregated the image into a total of 17 pure and mixed benthic classes.
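As a rough illustration of the classification step, the sketch below trains a generic supervised classifier on per-object features using field calibration labels. The feature set, classifier choice and data are placeholders assumed for illustration; the abstract does not specify the object-based image analysis software or rule set actually used.

```python
# Minimal sketch: training a supervised classifier on image-object features using
# georeferenced field points as calibration data. The classifier and the per-object
# features (band means) are illustrative assumptions, not the authors' actual workflow.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical per-object features (e.g. mean reflectance of WorldView-2 bands) and
# habitat labels assigned from the field survey points falling inside each object.
rng = np.random.default_rng(0)
X = rng.random((500, 8))              # 500 image objects x 8 band means (placeholder data)
y = rng.integers(0, 17, size=500)     # 17 benthic community classes (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Held-out field points give a rough per-class accuracy check.
print(classification_report(y_test, clf.predict(X_test), zero_division=0))
```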
Abstract:
The Central American Free Trade Agreement (CAFTA) has been a mixed blessing for economic development. While exports to the US economy have increased, dependency may hinder economic growth if countries do not diversify or upgrade before temporary provisions expire. This article evaluates the impact of the temporary Tariff Preference Levels (TPLs) granted to Nicaragua under CAFTA and the consequences of TPL expiration. Using trade statistics, country- and firm-level data from Nicaragua's National Free Zones Commission (CNZF), and data from field research, we estimate that Nicaragua's apparel sector will contract by as much as 30–40% after TPLs expire. Our analysis underscores how rules of origin and firm nationality affect where and how companies do business and, in so doing, often constrain sustainable export growth.
Abstract:
This research examined sex offender risk assessment and management in Ireland, focusing on the statutory agencies with primary responsibility (the Garda Síochána and the Probation Service). The goal was to document the historical, contextual and current systems and to identify areas of concern and potential improvements. The research adopted a mixed-methods approach comprising eight studies: documentary reviews of four Commission to Inquire Reports; qualitative interviews and focus groups with Garda staff, Probation Service staff, statutory agencies, community stakeholders, various Non-Governmental Organisations (NGOs) and sex offenders; and quantitative questionnaires administered to Garda staff. In all, over 70 interviews were conducted and questionnaires were forwarded to 270 Garda members. The overall findings are:
• Sex offender management in Ireland has been formalised only since 2001; knowledge, skills and expertise are still in their infancy and continue to evolve.
• Respondents gave mixed reviews of the risk assessment tools currently in use and questioned their fitness for purpose.
• The Sex Offender Act 2001 requires additional elements to ensure safe sex offender monitoring and public protection; many respondents recommended a judicial review of the Act.
• Interagency working under SORAM was hugely welcomed; managing agencies identified the sharing of information as the key benefit for improving sex offender management.
• Respondents reported that, in practice, sex offender management in Ireland is fragmented and unevenly implemented.
The research concluded that an independent National Sex Offender Authority should be established as an oversight and regulatory body for policy, strategy and direction in sex offender management. Further areas of research were also highlighted: ongoing evaluation and audits of the joint agency processes and systems in place; recidivism studies tracking risk assessment ratings and subsequent offending; and an evaluation of the current status of sex offender housing in Ireland.
Abstract:
Many problems in transportation, telecommunications and logistics can be modelled as network design problems. The classical problem consists of routing a flow (data, people, products, etc.) over a network, subject to a number of constraints, so as to satisfy demand while minimizing cost. In this thesis, we study the single-commodity fixed-charge capacitated network design problem, which we transform into an equivalent multicommodity problem in order to improve the lower bound obtained from the continuous relaxation of the model. The method we present for solving this problem is an exact branch-and-price-and-cut method with a stopping condition, in which we combine column generation, cut generation and the branch-and-bound algorithm, three of the most widely used techniques in integer linear programming. We test our method on two groups of instances of different sizes (large and very large) and compare it with the results given by CPLEX, one of the best solvers for mathematical optimization problems, as well as with a branch-and-cut method. Our method proves promising and can give good results, particularly for the very large instances.
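For orientation, the sketch below states the underlying single-commodity fixed-charge capacitated network design model on a small hypothetical network using PuLP. It shows only the base formulation; the multicommodity reformulation and the branch-and-price-and-cut algorithm developed in the thesis are not reproduced here, and all costs, capacities and demands are invented.

```python
# Minimal sketch of the single-commodity fixed-charge capacitated network design model
# on a hypothetical 4-node network. The thesis's contribution (multicommodity
# reformulation, branch-and-price-and-cut) is not shown here.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

nodes = [1, 2, 3, 4]
arcs = [(1, 2), (1, 3), (2, 4), (3, 4), (2, 3)]
c = {(1, 2): 2, (1, 3): 3, (2, 4): 2, (3, 4): 1, (2, 3): 1}      # unit flow costs (assumed)
f = {(1, 2): 10, (1, 3): 8, (2, 4): 10, (3, 4): 12, (2, 3): 5}   # fixed arc-opening costs (assumed)
u = {a: 15 for a in arcs}                                        # arc capacities (assumed)
b = {1: 20, 2: 0, 3: 0, 4: -20}                                  # supply at node 1, demand at node 4

prob = LpProblem("fixed_charge_network_design", LpMinimize)
x = {a: LpVariable(f"x_{a[0]}_{a[1]}", lowBound=0) for a in arcs}   # flow on each arc
y = {a: LpVariable(f"y_{a[0]}_{a[1]}", cat=LpBinary) for a in arcs} # 1 if the arc is opened

prob += lpSum(c[a] * x[a] + f[a] * y[a] for a in arcs)           # routing + fixed costs

for i in nodes:                                                  # flow conservation at each node
    prob += (lpSum(x[a] for a in arcs if a[0] == i)
             - lpSum(x[a] for a in arcs if a[1] == i)) == b[i]

for a in arcs:                                                   # capacity linked to the opening decision
    prob += x[a] <= u[a] * y[a]

prob.solve()
print("cost =", value(prob.objective))
print({a: (value(x[a]), value(y[a])) for a in arcs})
```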
Abstract:
Public school choice education policy attempts to create an education marketplace. Although school choice research has focused on the parent's role in the school choice process, little is known about parents served by low-performing schools. Following market theory, students attending low-performing schools should be the primary users of school choice policy to access high-performing schools. Rather than moving to a better school, however, students remain in these low-performing schools. This study took place in Miami-Dade County, which offers a wide variety of school choice options through charter schools, magnet schools, and open-choice schools. This dissertation utilized a mixed-methods design to examine the decision-making process and the school choice options used by the parents of students served by low-performing elementary schools in Miami-Dade County. Twenty-two semi-structured interviews were conducted with the parents of students served by low-performing schools. Binary logistic regression models were fitted to the data to compare demographic characteristics, academic achievement and distance from alternative schooling options between transfers and non-transfers. Multinomial logistic regression models were fitted to the data to evaluate how demographic characteristics, distance to the transfer school, and transfer school grade influenced the type of school a transfer student chose. A geographic analysis was conducted to determine how many miles students lived from alternative schooling options and how many miles transfer students lived from their transfer school. The interview data illustrated that parents' perceived needs are not being adequately addressed by state policy and county programs. The statistical analysis found that students from higher socioeconomic groups were no more likely to transfer than students from lower socioeconomic groups. Additionally, students who did transfer were unlikely to end up at a high-achieving school. The binary logistic regression demonstrated that transfer students were significantly more likely to live near alternative school options.
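A minimal sketch of the two regression analyses described above follows, assuming hypothetical column names (transferred, ses, achievement, distances, a coded school type) and an illustrative data file; the dissertation's actual variables and coding are not reproduced.

```python
# Sketch of the binary and multinomial logistic regressions described above, with
# hypothetical column names and a hypothetical student-level file.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical file of students at low-performing schools

# Binary logistic regression: did the student transfer out of the low-performing school?
binary_model = smf.logit(
    "transferred ~ ses + achievement + dist_nearest_alt", data=df
).fit()
print(binary_model.summary())

# Multinomial logistic regression: among transfers, which type of school was chosen
# (charter, magnet, open-choice), as a function of demographics, distance and school grade.
transfers = df[df["transferred"] == 1].copy()
transfers["choice_code"] = transfers["school_type"].map(
    {"charter": 0, "magnet": 1, "open_choice": 2}
)
mn_model = smf.mnlogit(
    "choice_code ~ ses + dist_transfer_school + transfer_school_grade", data=transfers
).fit()
print(mn_model.summary())
```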
Abstract:
Shifts in global climate resonate in plankton dynamics, biogeochemical cycles, and marine food webs. We studied these linkages in the North Atlantic subpolar gyre (NASG), which hosts extensive phytoplankton blooms. We show that phytoplankton abundance has increased since the 1960s in parallel with a deepening of the mixed layer and a strengthening of winds and heat losses from the ocean, driven by the low-frequency variability of the North Atlantic Oscillation (NAO). In parallel with these bottom-up processes, the top-down control of phytoplankton by copepods decreased over the same period in the western NASG, following sea surface temperature changes typical of the Atlantic Multidecadal Oscillation (AMO). While previous studies have hypothesized that climate-driven warming would facilitate seasonal stratification of surface waters and a long-term phytoplankton increase in subpolar regions, here we show that deeper mixed layers in the NASG can be warmer and host a higher phytoplankton biomass. These results emphasize that different modes of climate variability regulate bottom-up (NAO) and top-down (AMO) forcing on phytoplankton at decadal timescales. As a consequence, the apparent relationships between phytoplankton, zooplankton, and their physical environment depend on the temporal scale of the observations (seasonal, interannual, or decadal). Predictions of the phytoplankton response to climate change should therefore be built upon what is learnt from observations at the longest timescales.
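The scale dependence noted in the closing sentences can be illustrated with synthetic series: two monthly series that share only a slow signal correlate weakly at monthly resolution but strongly once a decadal low-pass filter is applied. The data below are artificial, not the study's observations.

```python
# Illustrative sketch (synthetic data, not the study's observations): the correlation
# between two monthly series can differ markedly once a decadal running mean is applied.
import numpy as np

rng = np.random.default_rng(1)
n_months = 50 * 12                                  # 50 years of monthly values
t = np.arange(n_months)

decadal = np.sin(2 * np.pi * t / (20 * 12))         # shared slow (multidecadal) signal
driver = decadal + rng.standard_normal(n_months)    # e.g. a climate index plus fast noise
response = decadal + rng.standard_normal(n_months)  # e.g. a plankton series plus fast noise

def lowpass(x, window=120):                         # 10-year running mean
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

raw_r = np.corrcoef(driver, response)[0, 1]
dec_r = np.corrcoef(lowpass(driver), lowpass(response))[0, 1]
print(f"monthly correlation: {raw_r:.2f}, decadal correlation: {dec_r:.2f}")
```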
Abstract:
Background: The move toward evidence-based education has led to increasing numbers of randomised trials in schools. However, the literature on recruitment to non-clinical trials is relatively underdeveloped compared with that on clinical trials. Recruitment to school-based randomised trials is challenging, even more so when the focus of the study is a sensitive issue such as sexual health. This article reflects on the challenges of recruiting post-primary schools, adolescent pupils and parents to a cluster randomised feasibility trial of a sexual health intervention, and the strategies employed to address them.
Methods: The Jack Trial was funded by the UK National Institute for Health Research (NIHR). It comprised a feasibility study of an interactive film-based sexual health intervention entitled If I Were Jack, recruiting over 800 adolescents from eight socio-demographically diverse post-primary schools in Northern Ireland. It aimed to determine the facilitators and barriers to recruitment and retention to a school-based sexual health trial and identify optimal multi-level strategies for an effectiveness study. As part of an embedded process evaluation, we conducted semi-structured interviews and focus groups with principals, vice-principals, teachers, pupils and parents recruited to the study as well as classroom observations and a parents’ survey.
Results: With reference to Social Learning Theory, we identified a number of individual, behavioural and environmental level factors which influenced recruitment. Commonly identified facilitators included perceptions of the relevance and potential benefit of the intervention to adolescents, the credibility of the organisation and individuals running the study, support offered by trial staff, and financial incentives. Key barriers were prior commitment to other research, lack of time and resources, and perceptions that the intervention was incompatible with pupil or parent needs or the school ethos.
Conclusions: Reflecting on the methodological challenges of recruiting to a school-based sexual health feasibility trial, this study highlights pertinent general and trial-specific facilitators and barriers to recruitment, which will prove useful for future trials with schools, adolescent pupils and parents.
Abstract:
In a team of multiple agents, the pursuit of a common goal is a defining characteristic. Since agents may have different capabilities, and the effects of actions may be uncertain, a common goal can generally only be achieved through careful cooperation between the different agents. In this work, we propose a novel two-stage planner that combines online planning at both the team level and the individual level through a subgoal delegation scheme. The proposal brings the advantages of online planning approaches to the multi-agent setting. A number of modifications are made to a classical UCT approximation algorithm to (i) adapt it to the application domains considered, (ii) reduce the branching factor in the underlying search process, and (iii) effectively manage uncertain information about action effects by using information fusion mechanisms. The proposed online multi-agent planner reduces the cost of planning and decreases the time needed to reach a goal, while significantly increasing the chance of achieving the common goal.
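For context, the sketch below shows the standard UCB1 child-selection rule at the core of a classical UCT planner; the authors' specific modifications (domain adaptation, branching-factor reduction, information fusion) are not reproduced here.

```python
# Minimal sketch of the standard UCB1 selection rule used by UCT to pick the next
# child node to explore. Node statistics below are illustrative placeholders.
import math
from dataclasses import dataclass, field

@dataclass
class Node:
    visits: int = 0
    total_reward: float = 0.0
    children: list = field(default_factory=list)

def ucb1(child: Node, parent_visits: int, c: float = 1.4) -> float:
    if child.visits == 0:
        return float("inf")                 # always try unvisited actions first
    exploit = child.total_reward / child.visits
    explore = c * math.sqrt(math.log(parent_visits) / child.visits)
    return exploit + explore

def select_child(parent: Node) -> Node:
    # Pick the child that maximises the exploration/exploitation trade-off.
    return max(parent.children, key=lambda ch: ucb1(ch, parent.visits))

# Example: three candidate actions with different empirical returns and visit counts.
root = Node(visits=30, children=[Node(10, 6.0), Node(15, 8.0), Node(5, 3.5)])
best = select_child(root)
print(best.visits, best.total_reward / best.visits)
```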
Abstract:
Tests for dependence of continuous, discrete and mixed continuous-discrete variables are ubiquitous in science. The goal of this paper is to derive Bayesian alternatives to frequentist null hypothesis significance tests for dependence. In particular, we present three Bayesian tests for dependence of binary, continuous and mixed variables. These tests are nonparametric and based on the Dirichlet Process, which allows us to use the same prior model for all of them. The tests are therefore mutually "consistent", in the sense that the probabilities of dependence computed with these tests are commensurable across the different types of variables being tested. By means of simulations with artificial data, we show the effectiveness of the new tests.
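As a simplified illustration, for two binary variables the analogous finite case places a Dirichlet prior over the 2×2 joint probabilities and reads a probability of dependence off posterior Monte Carlo samples; this is only a sketch of that special case, not the paper's Dirichlet Process construction.

```python
# Illustrative sketch for two binary variables: a Dirichlet prior over the 2x2 joint
# probabilities (a finite special case, not the paper's Dirichlet Process machinery).
# The posterior probability that the mutual information exceeds a small threshold is
# estimated from Monte Carlo samples.
import numpy as np

def posterior_prob_dependent(counts_2x2, alpha=1.0, eps=0.01, n_samples=20_000, seed=0):
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts_2x2, dtype=float).ravel()       # cells (0,0),(0,1),(1,0),(1,1)
    samples = rng.dirichlet(counts + alpha, size=n_samples)     # posterior joint probabilities
    p = samples.reshape(n_samples, 2, 2)
    px = p.sum(axis=2, keepdims=True)                           # marginal of X
    py = p.sum(axis=1, keepdims=True)                           # marginal of Y
    with np.errstate(divide="ignore", invalid="ignore"):
        mi = np.nansum(p * np.log(p / (px * py)), axis=(1, 2))  # mutual information per sample
    return float(np.mean(mi > eps))

# Example: counts from a hypothetical 2x2 table of two binary variables.
table = [[30, 10],
         [8, 32]]
print("P(dependent) ~", posterior_prob_dependent(table))
```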
Abstract:
Conventional taught learning practices often experience difficulties in keeping students motivated and engaged. Video games, however, are very successful at sustaining high levels of motivation and engagement through a set of tasks for hours without apparent loss of focus. In addition, gamers solve complex problems within a gaming environment without feeling fatigue or frustration, as they would typically do with a comparable learning task. Based on this notion, the academic community is keen on exploring methods that can deliver deep learner engagement and has shown increased interest in adopting gamification – the integration of gaming elements, mechanics, and frameworks into non-game situations and scenarios – as a means to increase student engagement and improve information retention. Its effectiveness when applied to education has been debatable, though, as attempts have generally been restricted to one-dimensional approaches such as transposing a trivial reward system onto existing teaching materials and/or assessments. Nevertheless, a gamified, multi-dimensional, problem-based learning approach can yield improved results even when applied to a very complex and traditionally dry task like the teaching of computer programming, as shown in this paper. The presented quasi-experimental study used a combination of instructor feedback, a real-time sequence of scored quizzes, and live coding to deliver a fully interactive learning experience. More specifically, the "Kahoot!" Classroom Response System (CRS), the classroom version of the TV game show "Who Wants To Be A Millionaire?", and Codecademy's interactive platform formed the basis for a learning model which was applied to an entry-level Python programming course. Students were thus allowed to experience multiple interlocking methods similar to those commonly found in a top-quality game experience. To assess gamification's impact on learning, empirical data from the gamified group were compared to those from a control group that was taught through a traditional learning approach, similar to the one used during previous cohorts. Despite this being a relatively small-scale study, the results and findings for a number of key metrics, including attendance, downloading of course material, and final grades, were encouraging and showed that the gamified approach was motivating and enriching for both students and instructors.
Abstract:
After having elective percutaneous coronary intervention (PCI), patients are expected to self-manage their coronary heart disease (CHD) by modifying their risk factors, adhering to medication and effectively managing any recurring angina symptoms, but this self-management may be ineffective. Objective: To explore how patients self-manage their CHD after elective PCI and to identify any factors that may influence this. Design and method: This mixed-methods study recruited a convenience sample of patients (n = 93) approximately three months after elective PCI. Quantitative data were collected using a survey and were subjected to univariate, bivariate and multivariate analysis. Qualitative data from participant interviews were analysed using thematic analysis. Findings: After PCI, 74% of participants managed their angina symptoms inappropriately. Younger participants and those with threatening perceptions of their CHD were more likely to know how to effectively manage their angina symptoms. Few patients adopted a healthier lifestyle after PCI. Qualitative analysis revealed that intentional non-adherence to some medicines was an issue. Some participants felt unsupported by healthcare providers and social networks in relation to their self-management. Participants reported strong emotional responses to CHD, and this had a detrimental effect on their self-management. Few patients accessed cardiac rehabilitation.
Abstract:
Background: Among other causes, the long-term outcome of hip prostheses in dogs is determined by aseptic loosening. Prosthesis complications can be prevented by optimizing the tribological system, which ultimately results in improved implant duration. In this context, a computerized model for the calculation of hip joint loadings during different motions would be of benefit. As a first step in the development of such an inverse dynamic multi-body simulation (MBS) model, we here present the setup of a canine hind limb model applicable to the calculation of ground reaction forces. Methods: The anatomical geometries of the MBS model were established using computed tomography (CT) and magnetic resonance imaging (MRI) data. The CT data were collected from the pelvis, femora, tibiae and pads of a mixed-breed adult dog. Geometric information about 22 muscles of the pelvic extremity of 4 mixed-breed adult dogs was determined using MRI. Kinematic and kinetic data obtained by motion analysis of a clinically healthy dog during a gait cycle (1 m/s) on an instrumented treadmill were used to drive the model in the multi-body simulation. Results and Discussion: The vertical ground reaction forces (z-direction) calculated by the MBS system show a maximum deviation from the treadmill measurements of 1.75% BW for the left and 4.65% BW for the right hind limb. The calculated peak ground reaction forces in the z- and y-directions were comparable to the treadmill measurements, whereas the curve characteristics of the forces in the y-direction were not in complete agreement. Conclusion: The developed MBS model is suitable for simulating ground reaction forces of dogs during walking. In forthcoming investigations, the model will be developed further to calculate the forces and moments acting on the hip joint during different movements, which can be of help in the in silico development and testing of hip prostheses.
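The kind of comparison reported above can be sketched as follows, using placeholder force curves rather than the study's treadmill or MBS data: the deviation between simulated and measured vertical ground reaction forces is normalised by body weight and reported in %BW.

```python
# Sketch of the comparison reported above: maximum deviation between simulated and
# measured vertical ground reaction forces, in percent body weight (%BW).
# The force curves and body weight below are placeholders, not the study's data.
import numpy as np

body_weight_N = 300.0                                   # hypothetical dog, roughly 30 kg
t = np.linspace(0.0, 1.0, 101)                          # one normalised gait cycle

measured_N = 0.65 * body_weight_N * np.sin(np.pi * t) ** 2               # placeholder treadmill curve
simulated_N = measured_N + 0.03 * body_weight_N * np.sin(2 * np.pi * t)  # placeholder model output

deviation_bw = 100.0 * np.abs(simulated_N - measured_N) / body_weight_N
print(f"maximum deviation: {deviation_bw.max():.2f} %BW")
```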
Abstract:
When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision-making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision-making tool but a decision support tool, allowing better informed decisions to be made.

Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight, but also to test new theories and practices, without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is very well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, this allows you to answer some of the following questions:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?

The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance.

The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments.

The goal of this chapter is to introduce the newcomer to a topic that we think is a valuable addition to the toolset of analysts and decision makers. We will give you a summary of the information we have gathered from the literature and of the experience we have gained first-hand during the last five years, whilst obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered.
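To make the definition of a simulation model concrete, here is a trivial rule-based state update run step by step; the single-server queue is a generic illustration assumed for this sketch, not an example taken from this chapter.

```python
# Trivial sketch of "a set of rules that define how a system changes over time, given
# its current state": a discrete-time single-server queue, run step by step so the
# state trajectory can be observed (the model is run, not solved).
import random

random.seed(42)
queue_length = 0          # current system state
history = []

for minute in range(60):
    if random.random() < 0.6:                        # rule 1: a customer arrives with probability 0.6
        queue_length += 1
    if queue_length > 0 and random.random() < 0.5:   # rule 2: service completes with probability 0.5
        queue_length -= 1
    history.append(queue_length)                     # observe the state at every point in time

print("mean queue length:", sum(history) / len(history))
print("final state:", history[-1])
```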
Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements, to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further study, and finally, in Section 7, we conclude the chapter with a short summary.