876 results for Large scale evaluation


Relevance:

100.00%

Publisher:

Abstract:

The basic assumption of quantitative authorship attribution is that the author of a text can be selected from a set of possible authors by comparing the values of textual measurements in that text to their corresponding values in each possible author's writing sample. Over the past three centuries, many types of textual measurements have been proposed, but never before have the majority of these measurements been tested on the same dataset. A large-scale comparison of textual measurements is crucial if current techniques are to be used effectively and if new and more powerful techniques are to be developed. This article presents the results of a comparison of thirty-nine different types of textual measurements commonly used in attribution studies, in order to determine which are the best indicators of authorship. Based on the results of these tests, a more accurate approach to quantitative authorship attribution is proposed, which involves the analysis of many different textual measurements.
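The comparison logic described above can be sketched minimally: represent each writing sample by a vector of textual measurements and attribute the disputed text to the candidate whose sample is closest. The two measurements used here (mean word length and type-token ratio) are illustrative stand-ins for the thirty-nine measurement types tested in the article, and the candidate texts are invented.

```python
def features(text):
    # Two simple illustrative measurements: mean word length and type-token ratio.
    words = text.lower().split()
    mean_len = sum(len(w) for w in words) / len(words)
    ttr = len(set(words)) / len(words)
    return (mean_len, ttr)

def attribute(disputed, samples):
    # Pick the candidate author whose sample features lie closest
    # (Euclidean distance) to the disputed text's features.
    d = features(disputed)
    def dist(author):
        f = features(samples[author])
        return sum((a - b) ** 2 for a, b in zip(d, f)) ** 0.5
    return min(samples, key=dist)
```

In practice many more measurements would be combined, which is exactly the article's point: single measurements are weak indicators on their own.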

Relevance:

100.00%

Publisher:

Abstract:

Plant oxylipins are a large family of metabolites derived from polyunsaturated fatty acids. The characterization of mutants or transgenic plants affected in the biosynthesis or perception of oxylipins has recently emphasized the role of the so-called oxylipin pathway in plant defense against pests and pathogens. In this context, presumed functions of oxylipins include direct antimicrobial effect, stimulation of plant defense gene expression, and regulation of plant cell death. However, the precise contribution of individual oxylipins to plant defense remains essentially unknown. To get a better insight into the biological activities of oxylipins, in vitro growth inhibition assays were used to investigate the direct antimicrobial activities of 43 natural oxylipins against a set of 13 plant pathogenic microorganisms including bacteria, oomycetes, and fungi. This study showed unequivocally that most oxylipins are able to impair growth of some plant microbial pathogens, with only two out of 43 oxylipins being completely inactive against all the tested organisms, and 26 oxylipins showing inhibitory activity toward at least three different microbes. Six oxylipins strongly inhibited mycelial growth and spore germination of eukaryotic microbes, including compounds that had not previously been ascribed an antimicrobial activity such as 13-keto-9(Z),11(Z),15(Z)-octadecatrienoic acid and 12-oxo-10,15(Z)-phytodienoic acid. Interestingly, this first large-scale comparative assessment of the antimicrobial effects of oxylipins reveals that regulators of plant defense responses are also the most active oxylipins against eukaryotic microorganisms, suggesting that such oxylipins might contribute to plant defense through their effects both on the plant and on pathogens, possibly through related mechanisms. © 2005 American Society of Plant Biologists.

Relevance:

100.00%

Publisher:

Abstract:

A series of alkali-doped metal oxide catalysts were prepared and evaluated for activity in the transesterification of rapeseed oil to biodiesel. Of those evaluated, LiNO3/CaO, NaNO3/CaO, KNO3/CaO and LiNO3/MgO exhibited >90% conversion in a standard 3 h test. There was a clear correlation between base strength and activity. These catalysts appeared to be promising candidates to replace conventional homogeneous catalysts for biodiesel production as the reaction times are low enough to be practical in continuous processes and the preparations are neither prohibitively difficult nor costly. However, metal leaching from the catalyst was detected, and this resulted in some homogeneous activity. This would have to be resolved before these catalysts would be viable for large-scale biodiesel production facilities.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: To evaluate the implementation of the National Health Service (NHS) Health Check programme in one area of England from the perspective of general practitioners (GPs). DESIGN: A qualitative exploratory study was conducted with GPs and other healthcare professionals involved in delivering the NHS Health Check and with patients. This paper reports the experience of GPs and focuses on the management of the Health Check programme in primary care. SETTING: Primary care surgeries in the Heart of Birmingham region (now under the auspices of the Birmingham Cross City Clinical Commissioning Group) were invited to take part in the larger-scale evaluation. This study focuses on a subset of those surgeries whose GPs were willing to participate. PARTICIPANTS: 9 GPs from different practices volunteered. GPs served an ethnically diverse region with areas of socioeconomic deprivation. Ethnicities of participant GPs included South Asian, South Asian British, white, black British and Chinese. METHODS: Individual semistructured interviews were conducted with GPs face to face or via telephone. Thematic analysis was used to analyse verbatim transcripts. RESULTS: Themes were generated which represent GPs' experiences of managing the NHS Health Check: primary care as a commercial enterprise; 'buy in' to concordance in preventive healthcare; following protocol and support provision. These themes represent the key issues raised by GPs and reveal variability in the implementation of NHS Health Checks. GPs also need support in allocating resources to the Health Check, including training on how to conduct checks in a concordant (or collaborative) way. CONCLUSIONS: The variability observed in this small-scale evaluation corroborates existing findings suggesting a need for more standardisation. Further large-scale research is needed to determine how that could be achieved. Work needs to be done to further develop a concordant approach to lifestyle advice which involves tailored individual goal setting rather than a paternalistic advice-giving model.

Relevance:

100.00%

Publisher:

Abstract:

One of the global phenomena posing threats to environmental health and safety is artisanal mining. Ambiguities in the manner in which ore-processing facilities operate hinder the mining capacity of these miners in Ghana. These problems are reviewed on the basis of current socio-economic conditions, health and safety, environmental impacts, and the use of rudimentary technologies, which limits fair-trade deals for miners. This research sought to use an established, data-driven, geographic information system (GIS)-based approach employing spatial analysis for locating a centralized processing facility within the Wassa Amenfi-Prestea Mining Area (WAPMA) in the Western region of Ghana. A spatial analysis technique that utilizes ModelBuilder within the ArcGIS geoprocessing environment, through suitability modeling, was used to systematically and simultaneously analyze a geographical dataset of selected criteria. The spatial overlay analysis methodology and the multi-criteria decision analysis approach were selected to identify the most preferred locations to site a processing facility. For an optimal site selection, seven major criteria, including proximity to settlements, water resources, artisanal mining sites, roads, railways, tectonic zones, and slopes, were considered to establish a suitable location for a processing facility. Site characterizations and environmental considerations, incorporating identified constraints such as proximity to large-scale mines, forest reserves and state lands, were also taken into account. The analysis was limited to criteria that were selected as relevant to the area under investigation. Saaty's analytical hierarchy process was utilized to derive relative importance weights for the criteria, and a weighted linear combination technique was then applied to combine the factors and determine the degree of potential site suitability. The final map output indicates estimated potential sites identified for the establishment of a facility centre. The results obtained provide intuitive areas suitable for consideration.
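As a rough illustration of the Saaty AHP weighting and weighted-linear-combination steps described above, the sketch below uses a hypothetical three-criterion pairwise comparison matrix (the study itself used seven criteria); the site names and scores are invented for the example.

```python
# Hypothetical pairwise comparison matrix for three criteria
# (e.g. proximity to roads, settlements, water resources).
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

n = len(pairwise)
col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
# Approximate the AHP principal eigenvector by averaging the normalized columns.
weights = [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# Weighted linear combination: suitability = sum(w_i * score_i) per candidate cell,
# with criterion scores standardized to [0, 1].
sites = {
    "site_A": [0.9, 0.4, 0.7],
    "site_B": [0.2, 0.8, 0.5],
}
suitability = {s: sum(w * x for w, x in zip(weights, v)) for s, v in sites.items()}
best = max(suitability, key=suitability.get)
```

In a GIS workflow the same combination is applied per raster cell rather than per named site, which is what a weighted overlay in ModelBuilder automates.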

Relevance:

100.00%

Publisher:

Abstract:

The assessment of adolescent drinking behavior is a complex task, complicated by variability in drinking patterns, the transitory and developmental nature of the behavior and the reliance (for large-scale studies) on self-report questionnaires. The Adolescent Alcohol Involvement Scale (Mayer & Filstead, 1979) is a 14-item screening tool designed to help identify alcohol misusers or more problematic drinkers. The present study utilized a large sample (n = 4066) of adolescents from Northern Ireland. Results of Confirmatory Factor Analyses and reliability estimates revealed that the 14 items share sufficient common variance that scores can be considered reliable and that the 14 items can be scored to provide a composite alcohol use score.
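The reliability estimate mentioned above can be illustrated with Cronbach's alpha, computed here from scratch on a tiny hypothetical response matrix (respondents × items); the actual study used the 14 AAIS items and n = 4066.

```python
def variance(xs):
    # Sample variance (n - 1 denominator).
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(data):
    # data: one row per respondent, one column per item.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    k = len(data[0])
    items = [[row[j] for row in data] for j in range(k)]
    totals = [sum(row) for row in data]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))
```

Items that move together across respondents (share common variance, as the CFA results indicate for the AAIS) drive alpha toward 1, supporting a single composite score.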

Relevance:

100.00%

Publisher:

Abstract:

In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is a tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. 
I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
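The hybrid replication policy described in the first part can be caricatured as a simple rule over observed read-write frequencies; the thresholds and labels below are invented for illustration and are not those of the dissertation's system.

```python
def replication_policy(reads, writes, eager_threshold=2.0, replicate_threshold=0.5):
    # Hypothetical policy: replicate a node's data to remote partitions only
    # when it is read often enough relative to how often it is written, and
    # push updates eagerly only when reads clearly dominate writes.
    ratio = reads / max(writes, 1)
    if ratio < replicate_threshold:
        return "no-replication"   # write-heavy: serve remote reads on demand
    return "eager" if ratio >= eager_threshold else "lazy"
```

Monitoring these frequencies per node and switching modes dynamically is what lets such a policy trade network communication against query latency.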

Relevance:

100.00%

Publisher:

Abstract:

A series of related research studies over 15 years assessed the effects of prawn trawling on sessile megabenthos in the Great Barrier Reef, to support management for sustainable use in the World Heritage Area. These large-scale studies estimated impacts on benthos (particularly removal rates per trawl pass), monitored subsequent recovery rates, measured natural dynamics of tagged megabenthos, mapped the regional distribution of seabed habitats and benthic species, and integrated these results in a dynamic modelling framework together with spatio-temporal fishery effort data and simulated management. Typical impact rates were between 5 and 25% per trawl, recovery times ranged from several years to several decades, and most sessile megabenthos were naturally distributed in areas where little or no trawling occurred and so had low exposure to trawling. The model simulated trawl impact and recovery on the mapped species distributions, and estimated the regional-scale cumulative changes due to trawling as a time series of status for megabenthos species. The regional status of these taxa at the time of greatest depletion ranged from ∼77% relative to pre-trawl abundance for the worst-case species, having slow recovery with moderate exposure to trawling, to ∼97% for the least affected taxon. The model also evaluated the expected outcomes for sessile megabenthos in response to major management interventions implemented between 1999 and 2006, including closures, effort reductions, and protected areas. As a result of these interventions, all taxa were predicted to recover (by 2-14% at 2025), with the most affected species having relatively greater recovery. Effort reductions made the biggest positive contributions to benthos status for all taxa, with closures making smaller contributions for some taxa. The results demonstrated that management actions have arrested and reversed previous unsustainable trends for all taxa assessed, and have led to a prawn trawl fishery with improved environmental sustainability. © 2015 International Council for the Exploration of the Sea.
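A minimal sketch of the impact-and-recovery dynamic described above (not the authors' model): relative abundance is depleted by a fixed proportion per trawl pass and recovers logistically toward its pre-trawl level. The rates below are placeholders chosen within the reported 5-25% per-trawl impact range.

```python
def simulate_status(years, trawls_per_year, impact=0.15, recovery_rate=0.2):
    # b is relative abundance (status), with 1.0 = pre-trawl level.
    b = 1.0
    for _ in range(years):
        for _ in range(trawls_per_year):
            b *= (1 - impact)              # proportional removal per trawl pass
        b += recovery_rate * b * (1 - b)   # logistic recovery toward b = 1
    return b
```

Running such a model per species over mapped distributions and recorded effort, as the studies did, yields the time series of regional status used to evaluate closures and effort reductions.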

Relevance:

100.00%

Publisher:

Abstract:

United States federal agencies assess flood risk using Bulletin 17B procedures, which assume annual maximum flood series are stationary. This represents a significant limitation of current flood frequency models, as the flood distribution is thereby assumed to be unaffected by trends or periodicity in atmospheric/climatic variables and/or anthropogenic activities. The validity of this assumption is at the core of this thesis, which aims to improve understanding of the forms and potential causes of non-stationarity in flood series for moderately impaired watersheds in the Upper Midwest and Northeastern US. Prior studies investigated non-stationarity in flood series for unimpaired watersheds; however, as the majority of streams are located in areas of increasing human activity, the relative and coupled impacts of natural and anthropogenic factors need to be considered so that non-stationary flood frequency models can be developed for flood risk forecasting over relevant planning horizons in large-scale water resources planning and management.
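One common way to detect the non-stationarity discussed above is the Mann-Kendall trend test; the sketch below computes its S statistic for an annual-maximum flood series (the thesis does not necessarily use this exact test).

```python
def mann_kendall_s(series):
    # Sign-based trend statistic: S sums sign(x_j - x_i) over all pairs i < j.
    # Strongly positive S suggests an upward trend in the annual-maximum
    # series, which would violate the stationarity assumed by Bulletin 17B.
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s
```

In a full analysis S is normalized into a Z score and compared to a significance threshold; here only the raw statistic is shown.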

Relevance:

100.00%

Publisher:

Abstract:

The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, this process can be complex to approach and discouraging to newcomers. In this context, there is an increasing need to derive simplified methodologies that streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates different possible applications, both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of the fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the productive area of San Felice is then studied to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising, in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, for a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of losses for precast RC buildings damaged by the 2012 earthquake is created. A statistical analysis is performed, allowing several consequence functions to be derived. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at the large scale. Furthermore, these may be adopted to establish retrofit policies to prevent and reduce the consequences of future earthquakes in industrial areas.

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit using the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that makes use of solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
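A biased random-key genetic algorithm represents each solution as a vector of random keys in [0, 1) and relies on a problem-specific decoder to turn keys into a feasible solution. The greedy decoder below for the winner determination problem is a plausible sketch of that mechanism, not the paper's exact decoder.

```python
def decode(keys, bids):
    # bids: list of (items, price) pairs; keys: one random key per bid.
    # Greedy decoder: consider bids in increasing key order and accept a bid
    # only if none of its items has already been awarded (first-price profit).
    order = sorted(range(len(bids)), key=lambda i: keys[i])
    taken, profit = set(), 0.0
    for i in order:
        items, price = bids[i]
        if taken.isdisjoint(items):
            taken |= set(items)
            profit += price
    return profit
```

The genetic search then evolves the key vectors: crossover and mutation act on keys, while feasibility is guaranteed entirely by the decoder, which is what makes the BRKGA framework problem-independent.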

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE: To review the effectiveness of school food and nutrition policies worldwide in improving the school food environment and students' dietary intake, and in decreasing overweight and obesity. METHODS: Systematic review of published and unpublished literature up to November 2007 covering three categories of nutrition policy: nutrition guidelines, regulation of food and/or beverage availability, and price interventions applied in preschools, primary and secondary schools. RESULTS: 18 studies met the inclusion criteria. Most evidence of effectiveness was found for the impact of both nutrition guidelines and price interventions on intake and availability of food and drinks, with less conclusive research on product regulation. Despite the introduction of school food policies worldwide, few large-scale or national policies have been evaluated, and all included studies were from the USA and Europe. CONCLUSION: Some current school policies have been effective in improving the food environment and dietary intake in schools, but there is little evaluation of their impact on BMI. As schools have been proposed worldwide as a major setting for tackling childhood obesity, it is essential that future policy evaluations measure the long-term effectiveness of a range of school food policies in tackling both dietary intake and overweight and obesity.

Relevance:

90.00%

Publisher:

Abstract:

Human activities that modify land cover can alter the structure and biogeochemistry of small streams, but these effects are poorly known over large regions of the humid tropics where rates of forest clearing are high. We examined how conversion of Amazon lowland tropical forest to cattle pasture influenced the physical and chemical structure, organic matter stocks and N cycling of small streams. We combined a regional ground survey of small streams with an intensive study of nutrient cycling using 15N additions in three representative streams: a second-order forest stream, a second-order pasture stream and a third-order pasture stream. These three streams were within several km of each other and on similar soils. Replacement of forest with pasture decreased stream habitat complexity by changing streams from run-and-pool channels with forest leaf detritus (50% cover) to grass-filled channels (63% cover) with runs of slow-moving water. In the survey, pasture streams consistently had lower concentrations of dissolved oxygen and nitrate (NO3-) compared with similar-sized forest streams. Stable isotope additions revealed that the second-order pasture stream had a shorter NH4+ uptake length, higher uptake rates into organic matter components and a shorter 15NH4+ residence time than the second-order forest stream or the third-order pasture stream. Nitrification was significant in the forest stream (19% of the added 15NH4+) but not in the second-order (0%) or third-order (6%) pasture streams. The forest stream retained 7% of added 15N in organic matter compartments and exported 53% (15NH4+ = 34%; 15NO3- = 19%). In contrast, the second-order pasture stream retained 75% of added 15N, predominantly in grasses (69%), and exported only 4% as 15NH4+. The fate of tracer 15N in the third-order pasture stream more closely resembled that in the forest stream, with 5% of added N retained and 26% exported (15NH4+ = 9%; 15NO3- = 6%). These findings indicate that the widespread infilling by grass of small streams in areas deforested for pasture greatly increases the retention of inorganic N in first- and second-order streams, which make up roughly three-fourths of total stream channel length in Amazon basin watersheds. The importance of this phenomenon and its effect on N transport to larger rivers across the larger areas of the Amazon Basin will depend on better evaluation of both the extent and the scale at which stream infilling by grass occurs, but our analysis suggests the phenomenon is widespread.

Relevance:

90.00%

Publisher:

Abstract:

The subject of management is renowned for its addiction to fads and fashions. Project Management is no exception. The issue of interest for this paper is the establishment of the 'College of Complex Project Managers' and their 'competency standard for complex project managers.' Both have generated significant interest in the Project Management community, and like any other human endeavour they should be subject to critical evaluation. The results of this evaluation show significant flaws in the definition of complex in this case, the process by which the College and its standard have emerged, and the content of the standard. However, there is a significant case for a portfolio of research that extends the existing bodies of knowledge into large-scale complicated (or major) projects that would be owned by the relevant practitioner communities, rather than focused on one organization. Research questions are proposed that would commence this stream of activity towards an intelligent synthesis of what is required to manage in both complicated and truly complex environments.

Relevance:

90.00%

Publisher:

Abstract:

Background: Meta-analysis is increasingly being employed as a screening procedure in large-scale association studies to select promising variants for follow-up studies. However, standard methods for meta-analysis require the assumption of an underlying genetic model, which is typically unknown a priori. This drawback can introduce model misspecification, causing power to be suboptimal, or force the evaluation of multiple genetic models, which augments the number of false-positive associations, ultimately leading to wasted resources on fruitless replication studies. We used simulated meta-analyses of large genetic association studies to investigate naive strategies of genetic model specification to optimize screenings of genome-wide meta-analysis signals for further replication. Methods: Different methods, meta-analytical models and strategies were compared in terms of power and type-I error. Simulations were carried out for a binary trait in a wide range of true genetic models, genome-wide thresholds, minor allele frequencies (MAFs), odds ratios and between-study heterogeneity (τ²). Results: Among the investigated strategies, a simple Bonferroni-corrected approach that fits both multiplicative and recessive models was found to be optimal in most examined scenarios, reducing the likelihood of false discoveries and enhancing power in scenarios with small MAFs, either in the presence or absence of heterogeneity. Nonetheless, this strategy is sensitive to τ² whenever the susceptibility allele is common (MAF ≥ 30%), resulting in an increased number of false-positive associations compared with an analysis that considers only the multiplicative model. Conclusion: Invoking a simple Bonferroni adjustment and testing for both multiplicative and recessive models is fast and an optimal strategy in large meta-analysis-based screenings. However, care must be taken when the examined variants are common, where specification of a multiplicative model alone may be preferable.
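The recommended strategy (fit both a multiplicative and a recessive model, then Bonferroni-correct for the two tests) can be sketched for a single study as follows. The chi-square codings below are standard textbook approximations of the two genetic models, not the exact meta-analytical models simulated in the paper.

```python
import math

def chi2_2x2(a, b, c, d):
    # Pearson chi-square for a 2x2 table [[a, b], [c, d]], 1 degree of freedom.
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def pvalue_1df(x2):
    # Survival function of a 1-df chi-square: P = erfc(sqrt(x2 / 2)).
    return math.erfc(math.sqrt(x2 / 2))

def screen_variant(case_counts, control_counts, alpha=5e-8):
    # case_counts / control_counts: genotype counts (aa, aA, AA) for risk allele A.
    # Multiplicative model ~ allele-count test; recessive model ~ AA vs the rest.
    (c0, c1, c2), (k0, k1, k2) = case_counts, control_counts
    p_mult = pvalue_1df(chi2_2x2(2 * c0 + c1, c1 + 2 * c2,
                                 2 * k0 + k1, k1 + 2 * k2))
    p_rec = pvalue_1df(chi2_2x2(c0 + c1, c2, k0 + k1, k2))
    # Bonferroni correction for the two fitted models.
    return min(p_mult, p_rec) < alpha / 2
```

The halved threshold (alpha / 2) is the Bonferroni adjustment for evaluating two models; a genome-wide screening would use alpha around 5e-8 as in the paper's thresholds.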