36 results for Implementation strategies
Abstract:
Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering and enormous damage to homes, other buildings and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) Team of 36 space professionals analysed this problem over the course of the International Space University Summer Session Program and published their recommendations in the form of a report. The TREMOR Team proposes a series of space- and ground-based systems to provide improved capability to manage earthquakes. The first proposed system is a prototype earthquake early-warning system that improves the existing knowledge of earthquake precursors and addresses the potential of these phenomena. Thus, the system will at first enable the definitive assessment of whether reliable earthquake early warning is possible through precursor monitoring. Should the answer be affirmative, the system itself would then form the basis of an operational early-warning system. To achieve these goals, the authors propose a multi-variable approach in which the system will combine, integrate and process precursor data from space- and ground-based seismic monitoring systems (already existing and newly proposed systems) and data from a variety of related sources (e.g. historical databases, space weather data, fault maps). The second proposed system, the prototype earthquake simulation and response system, coordinates the main components of the response phase to reduce the time delays of response operations, increase the level of precision in the data collected, facilitate communication amongst teams, enhance rescue and aid capabilities and so forth. It is based in part on an earthquake simulator that will provide pre-event (if early warning is proven feasible) and post-event damage assessment and detailed data of the affected areas to the corresponding disaster management actors by means of a geographic information system (GIS) interface. This is coupled with proposed mobile satellite communication hubs to provide links between response teams. Business- and policy-based implementation strategies for these proposals, such as the establishment of a non-governmental organisation to develop and operate the systems, are included.
Abstract:
In this paper we address implementation strategies for Open Educational Resources (OER) within a multi-campus setting. A comparison is made between three institutions that are taking very different approaches: K.U.Leuven, which is a traditional university; the Open Universiteit (Netherlands), which is in the process of starting up the Network Open Polytechnics; and the Universitat Oberta de Catalunya. We look more deeply into the pedagogical and organizational issues involved in implementing an OER strategy and show how OER holds the promise of flexible solutions for reaching goals that at first sight seem very divergent.
Abstract:
BACKGROUND: The Cancer Fast-track Programme's aim was to reduce the time that elapsed between well-founded suspicion of breast, colorectal and lung cancer and the start of initial treatment in Catalonia (Spain). We sought to analyse its implementation and overall effectiveness. METHODS: A quantitative analysis of the programme was performed using data generated by the hospitals on the basis of seven fast-track monitoring indicators for the period 2006-2009. In addition, we conducted a qualitative study, based on 83 semi-structured interviews with primary and specialised health professionals and health administrators, to obtain their perception of the programme's implementation. RESULTS: About half of all new patients with breast, lung or colorectal cancer were diagnosed via the fast track, though the cancer detection rate declined across the period. Mean time from detection of suspected cancer in primary care to start of initial treatment was 32 days for breast, 30 for colorectal and 37 for lung cancer (2009). Professionals involved in the implementation of the programme reported that general practitioners faced with a suspicion of cancer had changed their conduct with the aim of preventing delays. Furthermore, hospitals were found to have pursued three distinct implementation strategies (top-down, consensus-based and participatory), which made for the cohesion and sustainability of the circuits. CONCLUSION: The programme has contributed to speeding up the diagnostic assessment and treatment of patients with suspicion of cancer, and to clarifying the patient pathway between primary and specialised care.
Abstract:
This paper is concerned with the realism of mechanisms that implement social choice functions in the traditional sense. Will agents actually play the equilibrium assumed by the analysis? As an example, we study the convergence and stability properties of Sjöström's (1994) mechanism, on the assumption that boundedly rational players find their way to equilibrium using monotonic learning dynamics and also with fictitious play. This mechanism implements most social choice functions in economic environments using as a solution concept the iterated elimination of weakly dominated strategies (only one round of deletion of weakly dominated strategies is needed). There are, however, many sets of Nash equilibria whose payoffs may be very different from those desired by the social choice function. With monotonic dynamics we show that many equilibria in all the sets of equilibria we describe are the limit points of trajectories that have completely mixed initial conditions. The initial conditions that lead to these equilibria need not be very close to the limiting point. Furthermore, even if the dynamics converge to the "right" set of equilibria, play can still converge to quite a poor outcome in welfare terms. With fictitious play, if the agents have completely mixed prior beliefs, beliefs and play converge to the outcome the planner wants to implement.
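For readers unfamiliar with the class of dynamics invoked here, the canonical example of a payoff-monotonic learning dynamic is the replicator dynamic. A minimal statement under standard normal-form assumptions (the abstract itself does not commit to a particular dynamic) is

\dot{x}_{ik} = x_{ik} \left[ u_i(e_k, x_{-i}) - u_i(x_i, x_{-i}) \right],

where x_{ik} is the weight player i's mixed strategy x_i places on pure strategy k, e_k denotes the k-th pure strategy, and u_i is player i's expected payoff. A dynamic is payoff-monotonic when the growth rates \dot{x}_{ik}/x_{ik} are ordered in the same way as the payoffs u_i(e_k, x_{-i}); the "completely mixed initial conditions" of the abstract are those with x_{ik} > 0 for every i and k.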
Implementation of IPM programs on European greenhouse tomato production areas: Tools and constraints
Abstract:
Whiteflies and whitefly-transmitted viruses are some of the major constraints on European tomato production. The main objectives of this study were to: identify where and why whiteflies are a major limitation on tomato crops; collect information about whiteflies and associated viruses; determine the available management tools; and identify key knowledge gaps and research priorities. This study was conducted within the framework of ENDURE (European Network for Durable Exploitation of Crop Protection Strategies). Two whitefly species are the main pests of tomato in Europe: Bemisia tabaci and Trialeurodes vaporariorum. Trialeurodes vaporariorum is widespread in all areas where the greenhouse industry is present, and B. tabaci has invaded, since the early 1990s, all the subtropical and tropical areas. Biotypes B and Q of B. tabaci are widespread and especially problematic. Other key tomato pests are Aculops lycopersici, Helicoverpa armigera, Frankliniella occidentalis, and leaf miners. Tomato crops are particularly susceptible to viruses causing Tomato yellow leaf curl disease (TYLCD). High incidences of this disease are associated with high pressure of its vector, B. tabaci. The ranked importance of B. tabaci established in this study correlates with the levels of insecticide use, showing B. tabaci as one of the principal drivers behind chemical control. Confirmed cases of resistance to almost all insecticides have been reported. Integrated Pest Management based on biological control (IPM-BC) is applied in all the surveyed regions and was identified as the strategy using the fewest insecticides. Other IPM components include greenhouse netting and TYLCD-tolerant tomato cultivars. Sampling techniques differ between regions; decisions are generally based upon whitefly densities and do not relate to control strategies or growing cycles. For population monitoring and control, whitefly species are always identified. In Europe, IPM-BC is the recommended strategy for sustainable tomato production. The IPM-BC approach is mainly based on inoculative releases of the parasitoids Eretmocerus mundus and Encarsia formosa and/or the polyphagous predators Macrolophus caliginosus and Nesidiocoris tenuis. However, some limitations to wider implementation have been identified: lack of biological solutions for some pests, costs of beneficials, low farmer confidence, costs of technical advice, and low pest injury thresholds. Research priorities to promote and improve IPM-BC are proposed in the following domains: (i) emergence and invasion of new whitefly-transmitted viruses; (ii) relevance of B. tabaci biotypes regarding insecticide resistance; (iii) biochemistry and genetics of plant resistance; (iv) economic thresholds and sampling techniques of whiteflies for decision making; and (v) conservation and management of native whitefly natural enemies and improvement of biological control of other tomato pests.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions that obey a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a field where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, whether the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is by contrast fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations. A model for forecasting any economic or financial magnitude can be defined with scientific rigour and yet lack any economic value, being useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasise the calibration of the strategies' parameters to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented from scratch in MATLAB as part of this thesis; no other mathematical or statistical software was used.
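As an illustration of the pairs-trading machinery this abstract describes, the following sketch fits the mean-reverting Ornstein-Uhlenbeck model to a spread series via its exact AR(1) discretisation and derives a z-score entry/exit rule. It is a minimal illustration in Python (the thesis itself used MATLAB); the function names, the unit time step and the thresholds are our assumptions, not the author's.

import numpy as np

def fit_ou(spread, dt=1.0):
    """Fit an Ornstein-Uhlenbeck process dX = kappa*(mu - X)dt + sigma*dW
    to a spread series via its exact AR(1) discretisation."""
    x, y = spread[:-1], spread[1:]
    b, a = np.polyfit(x, y, 1)                     # OLS of y = a + b*x
    kappa = -np.log(b) / dt                        # mean-reversion speed
    mu = a / (1.0 - b)                             # long-run mean
    resid = y - (a + b * x)
    sigma = resid.std(ddof=2) * np.sqrt(2.0 * kappa / (1.0 - b**2))
    return kappa, mu, sigma

def signal(spread, kappa, mu, sigma, entry=2.0, exit_=0.5):
    """Z-score rule: short the spread when rich, long when cheap."""
    z = (spread - mu) / (sigma / np.sqrt(2.0 * kappa))  # stationary std of OU
    pos = np.zeros_like(spread)
    for t in range(1, len(spread)):
        if z[t] > entry:
            pos[t] = -1.0          # spread too high: short it
        elif z[t] < -entry:
            pos[t] = 1.0           # spread too low: long it
        elif abs(z[t]) < exit_:
            pos[t] = 0.0           # mean reached: flatten
        else:
            pos[t] = pos[t - 1]    # otherwise hold
    return pos

# Example with a synthetic mean-reverting spread:
rng = np.random.default_rng(0)
x = np.zeros(1000)
for t in range(999):
    x[t + 1] = x[t] + 0.05 * (1.0 - x[t]) + 0.1 * rng.standard_normal()
kappa, mu, sigma = fit_ou(x)
positions = signal(x, kappa, mu, sigma)

The entry and exit thresholds of 2 and 0.5 standard deviations are conventional illustrative choices; in the spirit of the abstract, they and the OU parameters would need frequent recalibration as market conditions evolve.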
Abstract:
We performed a comprehensive study to assess the fitness for purpose of four chromatographic conditions for the determination of six groups of marine lipophilic toxins (okadaic acid (OA) and dinophysistoxins, pectenotoxins, azaspiracids, yessotoxins, gymnodimine and spirolides) by LC-MS/MS, in order to select the most suitable conditions as stated by the European Union Reference Laboratory for Marine Biotoxins (EURLMB). In every case, the elution gradient was optimized to achieve a total run-time cycle of 12 min. We performed a single-laboratory validation for the analysis of three matrices relevant to the seafood aquaculture industry (mussels, Pacific oysters and clams), and for sea urchins, for which no data about lipophilic toxins had been reported before. Moreover, we compared the method performance under alkaline conditions using two quantification strategies: external standard calibration (EXS) and matrix-matched standard calibration (MMS). Alkaline conditions were the only scenario that allowed detection windows with polarity switching in a 3200 QTrap mass spectrometer; thus the analysis of all toxins can be accomplished in a single run, increasing sample throughput. The limits of quantification under alkaline conditions met the validation requirements established by the EURLMB for all toxins and matrices, while the remaining conditions failed in some cases. The accuracy of the method and the matrix effects were generally dependent on the mobile phases and the seafood species. MMS had a moderate positive impact on method accuracy for crude extracts, but it showed poor trueness for seafood species other than mussels when analyzing hydrolyzed extracts. Alkaline conditions with EXS and recovery correction for OA were selected as the most suitable conditions in the context of our laboratory. This comparative study can help other laboratories choose the best conditions for the implementation of LC-MS/MS according to their own needs.
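For readers outside analytical chemistry, the two quantification strategies compared here differ only in where the calibration curve is built. In the usual textbook form (the abstract does not give the laboratory's exact equations), the concentration C recovered from a peak area A is

C_{EXS} = \frac{A - b_{solv}}{m_{solv} \cdot R}, \qquad C_{MMS} = \frac{A - b_{matrix}}{m_{matrix}},

where m and b are the slope and intercept of a calibration curve prepared in pure solvent (EXS) or in a blank matrix extract (MMS), and R is the recovery factor used for correction, as applied here to OA under alkaline conditions. MMS absorbs matrix effects into the calibration itself, at the cost of needing a representative blank matrix for each species.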
Abstract:
Background: To demonstrate the tobacco industry rationale behind the "Spanish model" of non-smokers' protection in hospitality venues and the impact it had on some European and Latin American countries between 2006 and 2011. Methods: Research on tobacco industry documents, triangulated against news and media reports. Results: As an alternative to the successful implementation of 100% smoke-free policies, several European and Latin American countries introduced partial smoking bans based on the so-called "Spanish model", a legal framework widely advocated by parts of the hospitality industry, with striking similarities to the "accommodation programmes" promoted by the tobacco industry in the late 1990s. These developments started with the implementation of the Spanish tobacco control law (Ley 28/2005) in 2006 and have increased since then. Conclusion: The Spanish experience demonstrates that partial smoking bans often resemble tobacco industry strategies and are used to spread a failed approach at the international level. Researchers, advocates and policy makers should be aware of this ineffective policy.
Abstract:
A change in paradigm is needed in the prevention of toxic effects on the nervous system, moving from its present reliance solely on data from animal testing to a prediction model mostly based on in vitro toxicity testing and in silico modeling. According to the report published by the National Research Council (NRC) of the US National Academies of Science, high-throughput in vitro tests will provide evidence for alterations in "toxicity pathways" as the best possible method of large-scale toxicity prediction. The challenges to implementing this proposal are enormous, and provide much room for debate. While many efforts address the technical aspects of implementing the vision, many questions around it also need to be addressed. Is the overall strategy the only one to be pursued? How can we move from the current to the future paradigm? Will we ever be able to reliably model chronic and developmental neurotoxicity in vitro? This paper summarizes four presentations from a symposium held at the International Neurotoxicology Conference in Xi'an, China, in June 2011. A. Li reviewed the current guidelines for neurotoxicity and developmental neurotoxicity testing, and discussed the major challenges that exist in realizing the NRC vision for toxicity testing. J. Llorens reviewed the biology of mammalian toxic avoidance in view of present knowledge on the physiology and molecular biology of the chemical senses, taste and smell. This background information supports the hypothesis that relating in vivo toxicity to chemical epitope descriptors that mimic the chemical encoding performed by the olfactory system may provide a way to the long-term future of complete in silico toxicity prediction. S. Ceccatelli reviewed the implementation of rodent and human neural stem cells (NSCs) as models for in vitro toxicity testing that measures parameters such as cell proliferation, differentiation and migration. These appear to be sensitive endpoints that can identify substances with developmental neurotoxic potential. C. Suñol reviewed the use of primary neuronal cultures in testing for neurotoxicity of environmental pollutants, including the study of the effects of persistent exposures and/or effects on differentiating cells, which allow recording of effects that can be extrapolated to human developmental neurotoxicity.
Abstract:
Real-time predictions are an indispensable requirement for traffic management, making it possible to evaluate the effects of the different available strategies or policies. Combining prediction of the network state with the evaluation of different traffic management strategies over the short-term future allows system managers to anticipate the effects of traffic control strategies ahead of time and thereby mitigate congestion. This paper presents the current framework of decision support systems for traffic management based on short- and medium-term predictions, and includes some reflections on their likely evolution, based on current scientific research and on the growing availability of new types of data and their associated methodologies.
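The predict-then-evaluate loop described here has a simple generic shape. As a purely illustrative sketch (none of these names or signatures come from the paper), a decision support system can be reduced to ranking candidate strategies by the cost of their predicted short-term network states:

from typing import Callable, Dict, List

def rolling_horizon_dss(current_state: Dict,
                        predict: Callable[[Dict, str, int], Dict],
                        cost: Callable[[Dict], float],
                        strategies: List[str],
                        horizon: int = 30) -> str:
    """Pick the management strategy whose predicted network state over
    the given horizon (e.g. minutes) has the lowest cost."""
    best, best_cost = strategies[0], float("inf")
    for s in strategies:
        predicted = predict(current_state, s, horizon)  # model-based forecast
        c = cost(predicted)                             # e.g. total delay
        if c < best_cost:
            best, best_cost = s, c
    return best

In a real deployment the predict step would be a calibrated traffic model fed with real-time data, and the cost function could measure total delay, queue lengths or emissions; the loop is re-run as new observations arrive.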
Abstract:
In this paper, a theoretical framework for analyzing the selection of governance structures for implementing collaboration agreements between firms and Technological Centers is presented and empirically discussed. This framework builds on the assumptions of Transaction Cost and Property Rights theories, complemented with several proposals from the Transactional Value Theory. This last theory is used to add some dynamism to the selection of the governance structure. As empirical evidence for this theoretical explanation, we analyse four real experiences of collaboration between firms and one Technological Center. These experiences are intended to represent the typology of relationships that Technological Centers usually face. A key result is obtained: R&D collaboration activities do not always need to be organized through hierarchical solutions. In those cases where future expected benefits and/or reputation issues play an important role, the traditional, more static theories cannot fully explain the governance structure selected for managing the R&D relationship. Consequently, these results justify further research on the adequacy of the theoretical framework presented in this paper in other contexts, for example, R&D collaborations between firms and/or between Universities or Public Research Centers and firms.
Abstract:
We propose a simple mechanism that implements the Ordinal Shapley Value (Pérez-Castrillo and Wettstein [2005]) for economies with three or fewer agents.
Abstract:
I consider the problem of assigning agents to objects where each agent must pay the price of the object he gets and prices must sum to a given number. The objective is to select an assignment-price pair that is envy-free with respect to the true preferences. I prove that the proposed mechanism implements the set of envy-free allocations both in Nash and in strong Nash equilibrium. The distinguishing feature of the mechanism is that it treats the announced preferences as the true ones and selects an allocation that is envy-free with respect to the announced preferences.
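To make the object this mechanism selects concrete, the sketch below computes an assignment-price pair that is envy-free with respect to announced valuations, with prices summing to a given number. It is an illustrative computation, not the paper's mechanism: the function name, the quasi-linear utilities, the welfare-maximizing matching and the feasibility linear program are our assumptions.

import numpy as np
from scipy.optimize import linear_sum_assignment, linprog

def envy_free_allocation(v, price_sum):
    """Given announced quasi-linear valuations v[i][j] of n agents for n
    objects, return an assignment and prices (summing to price_sum) that
    are envy-free with respect to the announced preferences."""
    n = len(v)
    v = np.asarray(v, dtype=float)
    # In quasi-linear settings, an envy-free assignment must maximize
    # total announced value (a standard result), so take a max-weight matching.
    rows, cols = linear_sum_assignment(-v)
    sigma = cols                           # agent i receives object sigma[i]
    # Envy-freeness: v[i, sigma[i]] - p[sigma[i]] >= v[i, j] - p[j] for all i, j,
    # i.e.  p[sigma[i]] - p[j] <= v[i, sigma[i]] - v[i, j].
    A_ub, b_ub = [], []
    for i in range(n):
        for j in range(n):
            if j == sigma[i]:
                continue
            row = np.zeros(n)
            row[sigma[i]], row[j] = 1.0, -1.0
            A_ub.append(row)
            b_ub.append(v[i, sigma[i]] - v[i, j])
    # Prices must sum to the given number; they may be negative.
    res = linprog(c=np.zeros(n), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.ones((1, n)), b_eq=[price_sum],
                  bounds=[(None, None)] * n)
    assert res.success, "no envy-free prices found for this assignment"
    return sigma, res.x

# Example: two agents, two objects, prices summing to 6.
assignment, prices = envy_free_allocation([[10, 4], [6, 8]], price_sum=6.0)

The linear program only checks feasibility (its objective is zero), since any price vector satisfying the envy constraints and the sum constraint will do for the announced preferences.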
Abstract:
In this paper we present a set of axioms guaranteeing that, in exchange economies with or without indivisible goods, the sets of Nash, strong and active Walrasian equilibria all coincide within the framework of market games.