Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources. 
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
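As a toy illustration of the self-organizing-map step described above, the sketch below clusters synthetic "neighborhood" profiles with a minimal 1-D SOM in plain NumPy. The data, grid size, and learning schedule are illustrative assumptions, not the study's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "neighborhoods": 60 units described by 5 contextual variables,
# drawn from two distinct profiles (made-up data, not the study's).
data = np.vstack([rng.normal(0, 1, (30, 5)),
                  rng.normal(4, 1, (30, 5))])

n_nodes, n_epochs = 4, 200        # a tiny 1-D map; real maps are larger 2-D grids
weights = rng.normal(0, 1, (n_nodes, data.shape[1]))

for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)               # decaying learning rate
    sigma = max(1.0 * (1 - epoch / n_epochs), 0.1)  # shrinking neighborhood radius
    for x in data[rng.permutation(len(data))]:
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        grid_dist = np.abs(np.arange(n_nodes) - bmu)
        h = np.exp(-grid_dist**2 / (2 * sigma**2))            # neighborhood kernel
        weights += lr * h[:, None] * (x - weights)            # pull nodes toward x

# Each neighborhood is assigned to the cluster of its best-matching node.
clusters = np.array([np.argmin(np.linalg.norm(weights - x, axis=1)) for x in data])
```

In the study itself, the resulting typology was then validated with random forests; in this toy version the two synthetic profiles end up on different map nodes.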
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
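A minimal sketch of the propensity-score-matching step, assuming simulated survey respondents and a hand-rolled logistic regression (the abstract does not specify the study's estimation details):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Simulated respondents: age, gender, education level (illustrative covariates).
X = np.column_stack([rng.integers(18, 80, n),
                     rng.integers(0, 2, n),
                     rng.integers(1, 4, n)]).astype(float)
wave = rng.integers(0, 2, n)      # 0 = pretest sample, 1 = posttest sample

# Propensity score: P(posttest | covariates) via logistic regression,
# fitted with plain gradient descent on standardized covariates.
Z = np.column_stack([np.ones(n), (X - X.mean(0)) / X.std(0)])
w = np.zeros(Z.shape[1])
for _ in range(500):
    p = 1 / (1 + np.exp(-Z @ w))
    w -= 0.1 * Z.T @ (p - wave) / n
pscore = 1 / (1 + np.exp(-Z @ w))

# Match every posttest respondent to the pretest respondent with the
# nearest propensity score (1-nearest-neighbour matching with replacement).
post, pre = np.where(wave == 1)[0], np.where(wave == 0)[0]
matches = pre[np.argmin(np.abs(pscore[post][:, None] - pscore[pre][None, :]), axis=1)]
```

After matching, the outcome difference would be tested on the matched pairs; a Rosenbaum-style sensitivity analysis, as in the abstract, then probes robustness to unobserved covariates.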
Abstract:
Assessing whether the climatic niche of a species may change between different geographic areas or time periods has become increasingly important in the context of ongoing global change. However, approaches and findings have remained largely controversial so far, calling for a unification of methods. Here, we build on a review of empirical studies of invasion to formalize a unifying framework that decomposes niche change into unfilling, stability, and expansion situations, taking both a pooled range and range-specific perspective on the niche, while accounting for climatic availability and climatic analogy. This framework provides new insights into the nature of climate niche shifts and our ability to anticipate invasions, and may help in guiding the design of experiments for assessing causes of niche changes.
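The unfilling/stability/expansion decomposition can be illustrated with a deliberately simplified grid overlap in climate space. The real framework additionally corrects for climate availability and analogy (e.g. via kernel densities); this sketch, with made-up occurrence data, shows only the bookkeeping.

```python
import numpy as np

rng = np.random.default_rng(3)
# Occurrences of a species in a two-axis climate space, for two ranges
# (synthetic data: the invaded range is shifted along the first axis).
native = rng.normal([0.0, 0.0], 1.0, (500, 2))
invaded = rng.normal([1.5, 0.0], 1.0, (500, 2))

# Grid the shared climate space and mark cells occupied in each range.
bins = [np.linspace(-4, 6, 41), np.linspace(-4, 4, 33)]
nat, _, _ = np.histogram2d(native[:, 0], native[:, 1], bins=bins)
inv, _, _ = np.histogram2d(invaded[:, 0], invaded[:, 1], bins=bins)
nat, inv = nat > 0, inv > 0

stability = (nat & inv).sum() / inv.sum()   # invaded cells already in the native niche
expansion = (inv & ~nat).sum() / inv.sum()  # invaded cells outside the native niche
unfilling = (nat & ~inv).sum() / nat.sum()  # native cells not (yet) invaded
```

By construction, stability and expansion partition the invaded niche, so they sum to one.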
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
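The step-by-step majorization in Grover's algorithm can be checked directly on a small state-vector simulation; this sketch (3 qubits, one marked state) is an illustration, not taken from the paper.

```python
import numpy as np

def majorizes(p, q):
    """True if distribution p majorizes q: every partial sum of p,
    sorted in decreasing order, dominates the corresponding sum of q."""
    cp = np.cumsum(np.sort(p)[::-1])
    cq = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(cp >= cq - 1e-12))

N, target = 8, 3                      # 3 qubits, one marked basis state
state = np.full(N, 1 / np.sqrt(N))    # uniform superposition
dists = [np.abs(state) ** 2]
for _ in range(2):                    # optimal iteration count ~ (pi/4) * sqrt(N)
    state[target] *= -1               # oracle: phase flip on the marked state
    state = 2 * state.mean() - state  # diffusion: inversion about the mean
    dists.append(np.abs(state) ** 2)
# Probability flows monotonically toward the target: each step's
# distribution majorizes the previous one.
```

After two iterations the marked state carries over 94% of the probability, and the chain of distributions is ordered by majorization, matching the "time arrow" reading above.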
Abstract:
The purpose of this manual is to provide design guidelines for low water stream crossings (LWSCs). Rigid criteria for determining the applicability of a LWSC to a given site are not established since each site is unique in terms of physical, social, economic, and political factors. Because conditions vary from county to county, it is not the intent to provide a "cook-book" procedure for designing a LWSC. Rather, engineering judgment must be applied to the guidelines contained in this manual.
Abstract:
Most counties have bridges that are no longer adequate, and are faced with large capital expenditure for replacement structures of the same size. In this regard, low water stream crossings (LWSCs) can provide an acceptable, low cost alternative to bridges and culverts on low volume and reduced maintenance level roads. In addition to providing a low cost option for stream crossings, LWSCs have been designed to have the additional benefit of stream bed stabilization. Considerable information on the current status of LWSCs in Iowa, along with insight into needs for design assistance, was gained from a survey of county engineers that was conducted as part of this research (Appendix A). Copies of responses and analysis are included in Appendix B. This document provides guidelines for the design of LWSCs. There are three common types of LWSCs: unvented ford, vented ford with pipes, and low water bridges. Selection among these depends on stream geometry, discharge, importance of road, and budget availability. To minimize exposure to tort liability, local agencies using low water stream crossings should consider adopting reasonable selection and design criteria and certainly provide adequate warning of these structures to road users. The design recommendations included in this report for LWSCs provide guidelines and suggestions for local agency reference. Several examples of design calculations are included in Appendix E.
Abstract:
In the November 2011 report issued by the Governor’s Transportation 2020 Citizen Advisory Commission (CAC), the commission recommended the Iowa Department of Transportation (DOT), at least annually, convene meetings with the cities and counties to review the operation, maintenance and improvement of Iowa’s public roadway system to identify ways to jointly increase efficiency. In response to this recommendation, Gov. Branstad directed the Iowa DOT to begin this effort immediately with a target of identifying $50 million of efficiency savings that can be captured from the $1.2 billion of Road Use Tax Funds (RUTF) provided to the Iowa DOT, cities and counties to administer, maintain and improve the public roadway system. This would build upon past joint and individual actions that have reduced administrative costs and resulted in increased funding for system improvements. Efficiency actions should be quantified, measured and reported to the public on a regular basis. Beyond the discussion of identifying funding solutions to our road and bridge needs, it is critical that all jurisdictions that own, maintain and improve the nation’s road and bridge systems demonstrate to the public these funds are utilized in the most efficient and effective manner. This requires continual innovation in all aspects of transportation planning, design, construction and maintenance - done in a transparent manner to clearly demonstrate to the public how their funds are being utilized. The Iowa DOT has identified 13 efficiency measures separated into two distinct categories – Program Efficiencies and Partnership Efficiencies. The total value of the efficiency measures is $50 million. Many of the efficiency items will need input, refinement and partnership from cities, counties, other local jurisdictions, and stakeholder interest groups. The Iowa DOT has begun meetings with many of these groups to help identify potential efficiency measures and strategies for moving forward. 
These partnerships and discussions will continue through implementation of the efficiency measures. Dependent on the measures identified, additional action may be required by the legislature, Iowa Transportation Commission, and/or other bodies to implement the action. In addition, a formal process will be developed to quantify, measure and report the results of actions taken on a regular basis.
Abstract:
The possibility of local elastic instabilities is considered in a first-order structural phase transition, typically a thermoelastic martensitic transformation, with associated interfacial and volumetric strain energy. They appear, for instance, as the result of shape change accommodation by simultaneous growth of different crystallographic variants. The treatment is phenomenological and deals with growth in both thermoelastic equilibrium and in nonequilibrium conditions produced by the elastic instability. Scaling of the transformed fraction curves against temperature is predicted only in the case of purely thermoelastic growth. The role of the transformation latent heat on the relaxation kinetics is also considered, and it is shown that it tends to increase the characteristic relaxation times as adiabatic conditions are approached, by keeping the system closer to a constant temperature. The analysis also reveals that the energy dissipated in the relaxation process has a double origin: release of elastic energy Wi and entropy production Si. The latter is shown to depend on both temperature rate and thermal conduction in the system.
Abstract:
Modeling concentration-response functions became extremely popular in ecotoxicology during the last decade. Indeed, modeling allows determining the total response pattern of a given substance. However, reliable modeling is demanding in terms of data, which is in contradiction with the current trend in ecotoxicology, which aims to reduce, for cost and ethical reasons, the number of data produced during an experiment. It is therefore crucial to determine experimental designs in a cost-effective manner. In this paper, we propose to use the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs to estimate the toxicity of the herbicide dinoseb on daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters and often related to their meaning, i.e. the concentrations are located close to the corresponding parameter values. Furthermore, the results show that the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. In order to reduce the experimental cost and the use of test organisms, especially in the case of long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of on preliminary experiments.
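For a concrete, hypothetical instance of a locally D-optimal design, the sketch below grid-searches two-point designs for a two-parameter log-logistic concentration-response model at assumed nominal values. The model form, nominal values, and candidate grid are illustrative and are not the dinoseb data from the paper.

```python
import numpy as np
from itertools import combinations

# Two-parameter log-logistic response: f(c) = 1 / (1 + (c / e)**b),
# with assumed nominal values (locally optimal designs require these).
b0, e0 = 2.0, 1.0

def grad(c):
    # Analytic gradient of f with respect to (b, e) at the nominal values.
    r = (c / e0) ** b0
    f = 1.0 / (1.0 + r)
    return np.array([-f**2 * r * np.log(c / e0),   # df/db
                     f**2 * r * b0 / e0])          # df/de

# D-optimality: maximize the determinant of the Fisher information matrix
# (constant error variance assumed). We search only designs with as many
# support points as parameters, matching the paper's typical finding.
candidates = np.logspace(-2, 2, 81)
best_design, best_det = None, -np.inf
for pair in combinations(candidates, 2):
    M = sum(np.outer(grad(c), grad(c)) for c in pair)
    d = np.linalg.det(M)
    if d > best_det:
        best_design, best_det = pair, d
```

With fixed nominal values from prior knowledge, as the abstract suggests, the same search could replace a preliminary range-finding experiment.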
Abstract:
OBJECTIVE: Intervention during the pre-psychotic period of illness holds the potential of delaying or even preventing the onset of a full-threshold disorder, or at least of reducing the impact of such a disorder if it does develop. The first step in realizing this aim was achieved more than 10 years ago with the development and validation of criteria for the identification of young people at ultra-high risk (UHR) of psychosis. Results of three clinical trials have been published that provide mixed support for the effectiveness of psychological and pharmacological interventions in preventing the onset of psychotic disorder. METHOD: The present paper describes a fourth study that has now been undertaken in which young people who met UHR criteria were randomized to one of three treatment groups: cognitive therapy plus risperidone (CogTher + Risp: n = 43); cognitive therapy plus placebo (CogTher + Placebo: n = 44); and supportive counselling + placebo (Supp + Placebo; n = 28). A fourth group of young people who did not agree to randomization were also followed up (monitoring: n = 78). Baseline characteristics of participants are provided. RESULTS AND CONCLUSION: The present study improves on the previous studies because treatment was provided for 12 months and the independent contributions of psychological and pharmacological treatments in preventing transition to psychosis in the UHR cohort and on levels of psychopathology and functioning can be directly compared. Issues associated with recruitment and randomization are discussed.
Abstract:
A practical activity designed to introduce wavefront coding techniques as a method to extend the depth of field in optical systems is presented. The activity is suitable for advanced undergraduate students, since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image processing. This paper provides the theoretical background and technical information for performing the experiment. The proposed activity requires students to develop a wide range of skills, since they are expected to deal with optical components, including spatial light modulators, and to write scripts to perform some calculations.
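The core effect behind the activity (a cubic phase mask making the point-spread function insensitive to defocus) can be previewed numerically before the bench work. The pupil sampling, mask strength, and defocus below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def psf(defocus, alpha, n=256):
    """Incoherent PSF of a circular pupil carrying a cubic phase mask of
    strength alpha (radians) plus a quadratic defocus term (radians)."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    phase = alpha * (X**3 + Y**3) + defocus * (X**2 + Y**2)
    pupil = np.where(X**2 + Y**2 <= 1, np.exp(1j * phase), 0)
    img = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
    return img / img.sum()

def focus_similarity(alpha, defocus=2 * np.pi):
    # Correlation between the in-focus and defocused PSFs.
    a, b = psf(0.0, alpha).ravel(), psf(defocus, alpha).ravel()
    return float(np.corrcoef(a, b)[0, 1])

coded = focus_similarity(alpha=20 * np.pi)  # with cubic phase mask
clear = focus_similarity(alpha=0.0)         # conventional aperture
# The coded PSF changes far less with defocus; digital deconvolution then
# restores a sharp image over the extended depth of field.
```

In the classroom version, the cubic phase would be displayed on the spatial light modulator and the deconvolution done in the image-processing step.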
Abstract:
A method of making a multiple matched filter which allows the recognition of different characters in successive planes in simple conditions is proposed. The generation of the filter is based on recording on the same plate the Fourier transforms of the different patterns to be recognized, each of which is affected by different spherical phase factors because the patterns have been placed at different distances from the lens. This is proved by means of experiments with a triple filter which allows satisfactory recognition of three characters.
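The Fourier-plane matched filtering underlying this scheme reduces, for a single pattern, to correlation implemented as a product in the frequency domain. This toy sketch (one filter, no spherical phase factors or multiplexing) locates one embedded pattern by its correlation peak; the scene and pattern are made up.

```python
import numpy as np

rng = np.random.default_rng(2)
scene = np.zeros((64, 64))
pattern = rng.random((8, 8))
scene[20:28, 30:38] = pattern        # embed the pattern to be recognized

# Matched filter: multiply the scene spectrum by the conjugate pattern
# spectrum, then inverse-transform to get the cross-correlation surface.
S = np.fft.fft2(scene)
T = np.fft.fft2(pattern, s=scene.shape)
corr = np.real(np.fft.ifft2(S * np.conj(T)))
peak = np.unravel_index(np.argmax(corr), corr.shape)   # location of the match
```

The multiplexed filter of the abstract additionally records several such spectra on one plate, each with a different spherical phase factor, so that each character produces its correlation peak in a different output plane.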
Abstract:
We describe the design, calibration, and performance of a surface forces apparatus capable of illuminating the contact interface for spectroscopic investigation using optical techniques. The apparatus can be placed in the path of a Nd-YAG laser for studies of the linear response or the second harmonic and sum-frequency generation from a material confined between the two surfaces. In addition to the standard fringes of equal chromatic order technique, which we have digitized for accurate and fast analysis, the distance of separation can be measured with a fiber-optic interferometer during spectroscopic measurements (2 Å resolution and 10 ms response time). The sample approach is accomplished through application of a motor drive, piezoelectric actuator, or electromagnetic lever deflection for variable degrees of range, sensitivity, and response time. To demonstrate the operation of the instrument, the stepwise expulsion of discrete layers of octamethylcyclotetrasiloxane from the contact is shown. Lateral forces may also be studied by using piezoelectric bimorphs to induce and direct the motion of one surface.
Abstract:
BACKGROUND AND OBJECTIVES: The SBP values to be achieved by antihypertensive therapy in order to maximize reduction of cardiovascular outcomes are unknown; neither is it clear whether in patients with a previous cardiovascular event, the optimal values are lower than in the low-to-moderate risk hypertensive patients, or a more cautious blood pressure (BP) reduction should be obtained. Because of the uncertainty whether 'the lower the better' or the 'J-curve' hypothesis is correct, the European Society of Hypertension and the Chinese Hypertension League have promoted a randomized trial comparing antihypertensive treatment strategies aiming at three different SBP targets in hypertensive patients with a recent stroke or transient ischaemic attack. As the optimal level of low-density lipoprotein cholesterol (LDL-C) level is also unknown in these patients, LDL-C-lowering has been included in the design. PROTOCOL DESIGN: The European Society of Hypertension-Chinese Hypertension League Stroke in Hypertension Optimal Treatment trial is a prospective multinational, randomized trial with a 3 × 2 factorial design comparing: three different SBP targets (1, <145-135; 2, <135-125; 3, <125 mmHg); two different LDL-C targets (target A, 2.8-1.8; target B, <1.8 mmol/l). The trial is to be conducted on 7500 patients aged at least 65 years (2500 in Europe, 5000 in China) with hypertension and a stroke or transient ischaemic attack 1-6 months before randomization. Antihypertensive and statin treatments will be initiated or modified using suitable registered agents chosen by the investigators, in order to maintain patients within the randomized SBP and LDL-C windows. All patients will be followed up every 3 months for BP and every 6 months for LDL-C. Ambulatory BP will be measured yearly. OUTCOMES: Primary outcome is time to stroke (fatal and non-fatal). 
Important secondary outcomes are: time to first major cardiovascular event; cognitive decline (Montreal Cognitive Assessment) and dementia. All major outcomes will be adjudicated by committees blind to randomized allocation. A Data and Safety Monitoring Board has open access to data and can recommend trial interruption for safety. SAMPLE SIZE CALCULATION: It has been calculated that 925 patients would reach the primary outcome after a mean 4-year follow-up, and this should provide at least 80% power to detect a 25% stroke difference between SBP targets and a 20% difference between LDL-C targets.
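For orientation only, the standard two-arm Schoenfeld approximation below shows how a required event count follows from the targeted effect size. The trial's own 925-event figure comes from its multi-arm factorial design assumptions, which the abstract does not fully specify, so this is an illustrative sketch rather than a reproduction of that calculation.

```python
from math import log
from statistics import NormalDist

def schoenfeld_events(hazard_ratio, alpha=0.05, power=0.80):
    """Events needed for a two-arm log-rank comparison with 1:1 allocation
    (Schoenfeld's approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return 4 * (z_alpha + z_beta) ** 2 / log(hazard_ratio) ** 2

# A 25% stroke-hazard reduction (HR = 0.75) at 80% power, two-sided 5% alpha:
events_25 = schoenfeld_events(0.75)
```

Larger targeted reductions need fewer events, which is why the 20% LDL-C-target comparison drives a larger share of the 925-event requirement than the 25% SBP-target comparison.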