934 results for Scientific consolidation
Abstract:
Identification of ways to enhance consistency and proper entrained air content in hardened concrete pavement has long been a goal of state highway agencies and the Federal Highway Administration. The work performed in this study was done under FHWA Work Order No: DTFH71-97-PTP-IA-47 and was referred to as Project HR-1068 by the Iowa DOT. The results of this study indicate that the monitoring devices give both the contractor and the contracting authority an effective means of maintaining a consistent rate of vibration and thereby achieving a quality concrete pavement product. The devices allow the contractor to monitor vibrator operation effectively and consistently. The equipment proved reliable under all weather and paver operating conditions. This type of equipment adds one more way of improving the consistency and quality of concrete pavement.
Abstract:
This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find a single optimum value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than with a traditional fixed number of jobs or a fixed memory threshold.
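The key idea is to adapt the memory threshold that governs job admission rather than fixing the number of parallel jobs. The sketch below illustrates that idea only; the membership functions, step sizes and function names (`adjust_threshold`, `admit_job`) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a fuzzy-logic rule that adapts a
# memory threshold used to decide whether another parallel job may start.
# All breakpoints and step sizes below are made-up illustrative values.

def membership_low(x):
    """Degree (0..1) to which memory utilisation x is 'low'."""
    return max(0.0, min(1.0, (0.6 - x) / 0.3))

def membership_high(x):
    """Degree (0..1) to which memory utilisation x is 'high'."""
    return max(0.0, min(1.0, (x - 0.7) / 0.2))

def adjust_threshold(threshold_gb, used_gb, total_gb,
                     step_gb=2.0, min_gb=8.0, max_gb=48.0):
    """Raise the threshold when memory is under-used, lower it under memory
    pressure; the two fuzzy degrees are blended into one weighted change."""
    utilisation = used_gb / total_gb
    raise_deg = membership_low(utilisation)    # rule 1: under-used -> raise
    lower_deg = membership_high(utilisation)   # rule 2: pressure  -> lower
    delta = step_gb * (raise_deg - lower_deg)  # defuzzified adjustment
    return max(min_gb, min(max_gb, threshold_gb + delta))

def admit_job(threshold_gb, used_gb, expected_job_gb):
    """Start another parallel job only if it fits under the threshold."""
    return used_gb + expected_job_gb <= threshold_gb

# Example: 64 GB node with 20 GB in use -> utilisation is low, so the
# threshold rises and a 6 GB job is admitted.
t = adjust_threshold(threshold_gb=32.0, used_gb=20.0, total_gb=64.0)
print(round(t, 1), admit_job(t, used_gb=20.0, expected_job_gb=6.0))
```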
Abstract:
The goal of this project was to provide an objective methodology to support public agencies and railroads in making decisions related to consolidation of at-grade rail-highway crossings. The project team developed a weighted-index method and an accompanying Microsoft Excel spreadsheet-based tool to help evaluate and prioritize all public highway-rail grade crossings systematically from the perspective of possible consolidation impact. Factors identified by stakeholders as critical were traffic volume, heavy-truck traffic volume, proximity to emergency medical services, proximity to schools, road system, and out-of-distance travel. Given the inherent differences between urban and rural locations, factors were considered and weighted differently based on crossing location. The weighted-index method allowed all factors of interest to be included and ranked independently, as well as weighted according to stakeholder priorities, to create a single index. If priorities change, this approach also allows factors and weights to be adjusted. The prioritization generated by this approach may be used to convey the need and opportunity for crossing consolidation to decision makers and stakeholders. It may also be used to quickly investigate the feasibility of a possible consolidation. Independently computed crossing risk and relative consolidation impact may be integrated and compared to develop the most appropriate treatment strategies or alternatives for a highway-rail grade crossing. A crossing with limited or low consolidation impact but a high safety risk may be a prime candidate for consolidation. Similarly, a crossing with potentially high consolidation impact as well as high risk may be an excellent candidate for crossing improvements or grade separation. The results of the highway-rail grade crossing prioritization represent a consistent and quantitative, yet preliminary, assessment. The results may serve as the foundation for more rigorous or detailed analyses and feasibility studies. Other pertinent site-specific factors, such as safety, maintenance costs, economic impacts, and location-specific access and characteristics, should also be considered.
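A weighted index of this kind reduces to scoring each factor, weighting it by stakeholder priority, and summing. The sketch below is a minimal illustration under assumed weights and 0-1 scaling; only the factor names come from the abstract, and it is not the project's Excel tool.

```python
# Minimal sketch of a weighted-index calculation for prioritising
# highway-rail grade crossings. Weights and scaling are illustrative
# assumptions; higher index = larger impact if the crossing is consolidated.

URBAN_WEIGHTS = {
    "traffic_volume": 0.25,
    "heavy_truck_volume": 0.20,
    "ems_proximity": 0.20,
    "school_proximity": 0.15,
    "road_system": 0.10,
    "out_of_distance_travel": 0.10,
}

RURAL_WEIGHTS = {
    "traffic_volume": 0.15,
    "heavy_truck_volume": 0.15,
    "ems_proximity": 0.25,
    "school_proximity": 0.10,
    "road_system": 0.10,
    "out_of_distance_travel": 0.25,
}

def consolidation_index(scores, urban=True):
    """Combine factor scores (each normalised to 0..1) into a single index,
    using urban or rural weights depending on crossing location."""
    weights = URBAN_WEIGHTS if urban else RURAL_WEIGHTS
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[f] * scores[f] for f in weights)

# Example rural crossing: modest traffic, far from EMS, long detour if closed.
example = {
    "traffic_volume": 0.3,
    "heavy_truck_volume": 0.2,
    "ems_proximity": 0.8,
    "school_proximity": 0.1,
    "road_system": 0.4,
    "out_of_distance_travel": 0.9,
}
print(round(consolidation_index(example, urban=False), 3))  # -> 0.55
```

Because the weights are explicit, changing stakeholder priorities only means editing the weight tables and re-ranking, which matches the abstract's point about adjustability.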
Abstract:
The objective was to evaluate the usefulness, accuracy, precision, and reproducibility of the second-generation CMD for PC concrete under production conditions.
Abstract:
This thesis studies means of formalisation that can assist the forensic expert in managing the factors that influence the evaluation of scientific evidence, while respecting established and acceptable inference procedures. According to a view advocated by a majority of the forensic and legal literature, adopted here without reservation as a starting point, the conceptualisation of an evaluative procedure is said to be 'coherent' when it rests on a systematic implementation of probability theory. Often, however, the implementation of probabilistic reasoning does not follow automatically and may run into problems of complexity, due, for example, to limited knowledge of the domain in question or to the large number of factors that may come into play. To manage such complications, the present work proposes to investigate a formalisation of probability theory by means of a graphical environment known as Bayesian networks. The main hypothesis that this research sets out to examine is that Bayesian networks, together with certain accessory concepts (such as qualitative and sensitivity analyses), constitute a key resource available to the forensic expert for approaching inference problems coherently, both conceptually and practically. From this working hypothesis, individual problems were extracted, articulated and addressed in a series of distinct but interconnected studies, whose results, published in peer-reviewed journals, are presented as appendices. From a general point of view, this work yields three categories of results. A first group of results demonstrates, on the basis of numerous examples drawn from diverse forensic domains, the compatibility and complementarity between Bayesian network models and existing probabilistic evaluation procedures. Building on these indications, the other two categories of results show, respectively, that Bayesian networks also make it possible to address domains previously largely unexplored from a probabilistic point of view, and that the availability of so-called 'hard' numerical data is not an indispensable condition for implementing the approaches proposed in this work. The thesis discusses these results in relation to the current literature and concludes by proposing Bayesian networks as a means of exploring new research avenues, such as the study of various forms of evidence combination and the analysis of decision making. For this last aspect, the assessment of probabilities, as advocated in this work, constitutes both a fundamental preliminary step and an operational means.
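The thesis's central tool is the Bayesian network as a graphical implementation of probability theory for evidence evaluation. The sketch below is a minimal illustration (not taken from the thesis) of the computation underlying the simplest two-node network, hypothesis H pointing to evidence E: the posterior probability of H and the likelihood ratio commonly used as the value of the evidence. The numbers are invented for the example.

```python
# Minimal sketch (illustrative, not from the thesis) of the probability update
# behind a two-node Bayesian network H -> E.

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H | E) from P(H), P(E | H) and P(E | not H) via Bayes' theorem."""
    joint_h = prior_h * p_e_given_h
    joint_not_h = (1.0 - prior_h) * p_e_given_not_h
    return joint_h / (joint_h + joint_not_h)

def likelihood_ratio(p_e_given_h, p_e_given_not_h):
    """The forensic 'value of the evidence': how much more probable the
    evidence is under H than under its negation."""
    return p_e_given_h / p_e_given_not_h

# Example: a matching trace characteristic that is rare in the population.
print(likelihood_ratio(0.95, 0.01))           # LR = 95
print(round(posterior(0.10, 0.95, 0.01), 3))  # prior 0.10 -> posterior ~0.913
```

Larger networks generalise the same update to many interrelated factors, which is what makes the graphical form useful when the number of influencing factors grows.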
Abstract:
The final tournament of the UEFA European Football Championship is one of the top sporting events in the world, and a high-profile event of this kind requires a well-planned and well-executed anti-doping programme to ensure the integrity of results in the competition. UEFA EURO 2012 presented a unique logistical challenge, with the tournament spread across two countries, both covering a large geographical area. This paper discusses the planning and delivery of both the pre-tournament out-of-competition (OOC) testing programme and the in-competition (IC) programme, reviews the activities of doping control officers (DCOs) and the whereabouts programme, and assesses the sample collection and transport process. The analytical approach applied is also discussed, along with an overview of the distribution of T/E ratios and blood parameters.
Abstract:
Background: Mantle cell lymphoma (MCL) is a rare subtype (3-9%) of non-Hodgkin lymphoma (NHL) with a relatively poor prognosis (5-year survival < 40%). Although consolidation of first remission with autologous stem cell transplantation (ASCT) is regarded as the gold standard, less than half of the patients can undergo this intensive treatment because of advanced age and co-morbidities. Standard-dose non-myeloablative radioimmunotherapy (RIT) appears to be a very efficient approach for the treatment of certain NHL. However, there are almost no data available on the efficacy and safety of RIT in MCL. Methods and Patients: In the RIT-Network, a web-based international registry collecting real observational data from RIT-treated patients, 115 MCL patients treated with ibritumomab tiuxetan were recorded. Most of the patients were elderly males with advanced-stage disease: median age 63 (range 31-78); males 70.4%; stage III/IV 92%. RIT (i.e. administration of ibritumomab tiuxetan) was part of first-line therapy in 48 pts. (43%). A further 38 pts. (33%) received ibritumomab tiuxetan after two previous chemotherapy regimens, and 33 pts. (24%) after completing 3-8 lines. In 75 cases RIT was applied as consolidation of a chemotherapy-induced response; the remaining patients received ibritumomab tiuxetan for relapsed/refractory disease. At present, follow-up data are available for 74 MCL patients. Results: After RIT the patients achieved a high response rate: CR 60.8%, PR 25.7%, and SD 2.7%. Only 10.8% of the patients progressed. For the survival analysis, many data had to be censored since documentation had not yet been completed. The projected 3-year overall survival (OAS, Fig. 1) after radioimmunotherapy was 72% for pts. who received RIT consolidation versus 29% for those treated for relapsed/refractory disease (p=0.03). RIT was feasible for almost all patients; only 3 procedure-related deaths were reported in the whole group. The main adverse event was hematological toxicity (grade III/IV cytopenias), with median times to recovery of Hb, WBC and Plt of 45, 40 and 38 days, respectively. Conclusion: Standard-dose non-myeloablative RIT is a feasible and safe treatment modality, even for elderly MCL pts. Consolidation radioimmunotherapy with ibritumomab tiuxetan may prolong the survival of patients who achieve a clinical response after chemotherapy. Therefore, this consolidation approach should be considered as a treatment strategy for those who are not eligible for ASCT. RIT also has a potential role as palliative therapy in relapsed/resistant cases.
Abstract:
This chapter tackles the tension between the tendency of scientific disciplines to "diversify" and the capacity of universities to give new scientific fields an institutional "home". The assumption is that new scientific fields must find support among scientists and the cognitive units of universities in order to be included. As science is a strongly competitive social field, inclusion often meets resistance. It is argued that the opportunities for new scientific fields to be included depend on the kind of governance regime ruling universities. A comparison of the former bureaucratic-oligarchic governance model of most European universities with the current new public management governance model shows that the propensity of universities to include new scientific fields has increased, though there may be a price to pay in terms of which fields stand a chance of being integrated and in terms of the institutional scope for the invention of new ideas.
Abstract:
Background: The issue of gender is acknowledged as a key issue for the AIDS epidemic. World AIDS Conferences (WAC) have constituted a major discursive space for the epidemic. We sought to establish the balance regarding gender in the AIDS scientific discourse by following its development in the published proceedings of WAC. Fifteen successive WAC (1989-2012) served to establish a "barometer" of scientific interest in heterosexual and homo/bisexual men and women throughout the epidemic. It was hypothesised that, as in other domains of Sexual and Reproductive Health, heterosexual men would be "forgotten" partners. Method: Abstracts from each conference were entered in electronic form into an Access database. Queries were created to generate five categories of interest and to monitor their annual frequency. All abstract titles including the term "men" or "women" were identified. Collections of synonyms were systematically and iteratively developed in order to further classify abstracts according to whether they included terms referring to "homo/bisexual" or "heterosexual". Reference to "Mother to Child Transmission" (MTCT) was also flagged. Results: The category including "men" without additional reference to "homo/bisexual" (i.e. referring to men in general and/or to heterosexual men) consistently appears four times less often than the equivalent category for women. Excluding abstracts on women and MTCT has little impact on this difference. Abstracts referring to both "men" and "homo/bisexual" emerge as the second most frequent category; the equivalent category for women is minimally present. Conclusion: The hypothesised absence of heterosexual men from the AIDS discourse was confirmed. Although the relative presence of homo/bisexual men and women as a focal subject may be explained by epidemiological data, this is not so in the case of heterosexual men and women. This imbalance has consequences for HIV prevention.
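The method described amounts to keyword and synonym queries over abstract titles, with yearly frequency counts per category. The sketch below is a minimal, assumed re-expression of that classification in code, not the authors' Access queries; the synonym lists and sample records are illustrative only.

```python
import re
from collections import Counter

# Illustrative synonym lists; the real study used larger, iteratively built ones.
MEN = r"\bmen\b"
WOMEN = r"\bwomen\b"
HOMO_BI = r"\b(gay|homosexual|bisexual|msm)\b"
MTCT = r"\b(mother[- ]to[- ]child|mtct|vertical transmission)\b"

def categorise(title):
    """Assign a title to the categories of interest based on keyword matches."""
    t = title.lower()
    cats = set()
    if re.search(MEN, t):
        cats.add("men+homo/bi" if re.search(HOMO_BI, t) else "men (general/hetero)")
    if re.search(WOMEN, t):
        cats.add("women+homo/bi" if re.search(HOMO_BI, t) else "women (general/hetero)")
    if re.search(MTCT, t):
        cats.add("MTCT")
    return cats

def yearly_counts(records):
    """records: iterable of (year, title); returns counts per (year, category)."""
    counts = Counter()
    for year, title in records:
        for cat in categorise(title):
            counts[(year, cat)] += 1
    return counts

# Invented example records for illustration.
sample = [
    (2010, "Condom use among young heterosexual men in rural settings"),
    (2010, "Preventing mother-to-child transmission in pregnant women"),
    (2012, "HIV incidence in men who have sex with men (MSM)"),
]
print(yearly_counts(sample))
```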
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of the consequences of inappropriate actions and decisions and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 TB covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru, based on satellite imagery and ground data collection, revealed a rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas.

The characterisation of risk at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
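The risk model rests on the intersection of hazard, exposure and vulnerability. The sketch below illustrates that multiplicative structure in its simplest form; the variables, numbers and aggregation are illustrative assumptions, not the published global model or its indexes.

```python
# Minimal sketch (illustrative, not the published model) of a multiplicative
# risk index in the spirit of risk = hazard x exposure x vulnerability.

def risk_index(hazard_freq, exposed_population, vulnerability):
    """Expected-impact proxy for one hazard type at one location.

    hazard_freq        - events per year (frequency/intensity proxy)
    exposed_population - people located in the hazard-prone area
    vulnerability      - fraction of the exposed population affected per event
    """
    return hazard_freq * exposed_population * vulnerability

def multiple_risk_index(per_hazard_risks):
    """Aggregate several single-hazard risks into one comparison figure."""
    return sum(per_hazard_risks)

# Example country profile with made-up numbers: floods and tropical cyclones.
floods = risk_index(hazard_freq=1.5, exposed_population=2_000_000, vulnerability=0.002)
cyclones = risk_index(hazard_freq=0.4, exposed_population=5_000_000, vulnerability=0.004)
print(floods, cyclones, multiple_risk_index([floods, cyclones]))  # 6000 8000 14000
```

Because each component is explicit, the same structure lets one ask how the index changes when climate change alters hazard frequency, population growth raises exposure, or development lowers vulnerability, which is the dynamic view of risk the abstract argues for.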