964 results for real-scale battery
Abstract:
This document is an unpublished article that has been accepted for publication. As a service to its authors and readers, Alternativas. Cuadernos de trabajo social provides this preliminary edition online. The manuscript may undergo changes during editing and proofreading before its final publication. Any such changes will in no case affect the information contained in this page, nor the essential content of the article.
Abstract:
This layer is a georeferenced raster image of the historic paper map entitled: The Survey Districts of North Harbour & Blueskin, Lower Harbour West, North East Valley, Upper Harbour West, Tomahawk, Sawyers Bay, Andersons Bay, Portobello Bay, Otago Peninsula & Upper Harbour East, drawn by G.P. Wilson, April 1896. It was published by N.Z. Lands and Survey in 1896. Covers the Dunedin region, New Zealand. Scale [ca. 1:63,360]. The image inside the map neatline is georeferenced to the surface of the earth and fit to the Universal Transverse Mercator (UTM Zone 59S, meters, WGS 1984) projected coordinate system. All map collar and inset information is also available as part of the raster image, including any inset maps, profiles, statistical tables, directories, text, illustrations, index maps, legends, or other information associated with the principal map. This map shows features such as property lot and block numbers, boundaries of survey districts and blocks, boroughs, townships and estates, drainage, selected roads, railroads and stations, selected buildings and industry locations, cemeteries, shoreline features, docks and wharves, and more. Relief shown by spot heights. This layer is part of a selection of digitally scanned and georeferenced historic maps from the Harvard Map Collection. These maps typically portray both natural and manmade features. The selection represents a range of originators, ground condition dates, scales, and map purposes.
Abstract:
Doctoral thesis, Psychology (Educational Psychology), Universidade de Lisboa, 2016
Abstract:
Aims The relationship between biodiversity and ecosystem functioning is among the most active areas of ecological research. Furthermore, enhancing the diversity of degraded ecosystems is a major goal in applied restoration ecology. In grasslands, many species may be locally absent due to dispersal or microsite limitation and may therefore profit from mechanical disturbance of the resident vegetation. We established a seed addition and disturbance experiment across several grassland sites of different land use to test whether plant diversity can be increased in these grasslands. Additionally, the experiment will allow us to test the consequences of increased plant diversity for ecosystem processes and for the diversity of other taxa in real-world ecosystems. Here we present details of the experimental design and report results from the first vegetation survey one year after disturbance and seed addition. Moreover, we tested whether the effects of seed addition and disturbance varied among grasslands depending on their land use or pre-disturbance plant diversity. Methods A full-factorial experiment was installed in 73 grasslands in three regions across Germany. Grasslands were under regular agricultural use, but varied in the type and the intensity of management, thereby representing the range of management typical for large parts of Central Europe. The disturbance treatment consisted of disturbing the top 10 cm of the sward using a rotavator or rotary harrow. Seed addition consisted of sowing a high-diversity seed mixture of regional plant species. These species were all regionally present, but often locally absent, depending on the resident vegetation composition and richness of each grassland. Important findings One year after treatment, sward disturbance had significantly increased the cover of bare soil, seedling species richness and numbers of seedlings. Seed addition had increased plant species richness, but only in combination with sward disturbance.
The increase in species richness, when both seed addition and disturbance were applied, was higher at high land-use intensity and low resident diversity. Thus, we show that at least the early recruitment of many species is also possible at high land-use intensity, indicating the potential to restore and enhance biodiversity of species-poor agricultural grasslands. Our newly established experiment provides a unique platform for broad-scale research on the land-use dependence of future trajectories of vegetation diversity and composition and their effects on ecosystem functioning.
Abstract:
As part of the Governor's effort to streamline State government through improvements in the efficiency and effectiveness of operations, Executive Order 2004-06 ("EO6") provided for the reorganization (consolidation) of the Department of Insurance, Office of Banks and Real Estate, Department of Professional Regulation and Department of Financial Institutions. Through EO6 the four predecessor Agencies were abolished and a single new agency, The Department of Financial and Professional Regulation (hereafter referred to as "IDFPR"), was created. The purpose of the consolidation of the four regulatory agencies was to allow for certain economies of scale to be realized primarily within the executive management and administrative functions. Additionally, the consolidation would increase the effectiveness of operations through the integration of certain duplicative functions within the four predecessor agencies without the degradation of the frontline functions. Beginning on or about July 1, 2004, the IDFPR began consolidation activities focusing primarily on the administrative functions of Executive Management, Fiscal and Accounting, General Counsel, Human Resources, Information Technology and Other Administrative Services. The underlying premise of the reorganization was that all improvements could be accomplished without the degradation of the frontline functions of the predecessor agencies. Accordingly, all powers, duties, rights, responsibilities and functions of the predecessor agencies migrated to IDFPR and the reorganization activities commenced July 1, 2004.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The potential for large-scale use of a sensitive real time reverse transcription polymerase chain reaction (RT-PCR) assay was evaluated for the detection of Tomato spotted wilt virus (TSWV) in single and bulked leaf samples by comparing its sensitivity with that of DAS-ELISA. Using total RNA extracted with RNeasy® or leaf soak methods, real time RT-PCR detected TSWV in all infected samples collected from 16 horticultural crop species (including flowers, herbs and vegetables), two arable crop species, and four weed species by both assays. In samples in which DAS-ELISA had previously detected TSWV, real time RT-PCR was effective at detecting it in leaf tissues of all 22 plant species tested at a wide range of concentrations. Bulk samples required more robust and extensive extraction methods with real time RT-PCR, but it generally detected one infected sample in 1000 uninfected ones. By contrast, ELISA was less sensitive when used to test bulked samples, once detecting up to 1 infected in 800 samples with pepper but never detecting more than 1 infected in 200 samples in tomato and lettuce. It was also less reliable than real time RT-PCR when used to test samples from parts of the leaf where the virus concentration was low. The genetic variability among Australian isolates of TSWV was small. Direct sequencing of a 587 bp region of the nucleoprotein gene (S RNA) of 29 isolates from diverse crops and geographical locations yielded a maximum of only 4.3% nucleotide sequence difference. Phylogenetic analysis revealed no obvious groupings of isolates according to geographic origin or host species. TSWV isolates that break TSWV resistance genes in tomato or pepper did not differ significantly in the N gene region studied, indicating that a different region of the virus genome is responsible for this trait.
Abstract:
This paper presents results from the first use of neural networks for the real-time feedback control of high temperature plasmas in a Tokamak fusion experiment. The Tokamak is currently the principal experimental device for research into the magnetic confinement approach to controlled fusion. In the Tokamak, hydrogen plasmas, at temperatures of up to 100 million K, are confined by strong magnetic fields. Accurate control of the position and shape of the plasma boundary requires real-time feedback control of the magnetic field structure on a time-scale of a few tens of microseconds. Software simulations have demonstrated that a neural network approach can give significantly better performance than the linear technique currently used on most Tokamak experiments. The practical application of the neural network approach requires high-speed hardware, for which a fully parallel implementation of the multi-layer perceptron, using a hybrid of digital and analogue technology, has been developed.
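The mapping the abstract describes — magnetic diagnostic signals in, control signals out, through a multi-layer perceptron — can be sketched in a few lines. This is a minimal fixed-weight MLP forward pass for illustration only; the layer sizes and random weights are assumptions, not the paper's trained network or its hybrid digital/analogue hardware.

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass of a multi-layer perceptron: each hidden layer
    applies an affine map followed by tanh; the output layer is linear."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.tanh(W @ h + b)
    return weights[-1] @ h + biases[-1]

# Illustrative network: 16 diagnostic inputs -> 8 hidden units -> 4
# control outputs (sizes are assumptions, not taken from the paper).
rng = np.random.default_rng(0)
sizes = [16, 8, 4]
weights = [rng.standard_normal((m, n)) * 0.1 for n, m in zip(sizes, sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

y = mlp_forward(rng.standard_normal(16), weights, biases)
```

Because the forward pass is just two small matrix-vector products, it is the kind of computation that parallel hardware can evaluate within the tens-of-microseconds budget the abstract mentions.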
Abstract:
Purpose – Academic writing is often considered to be a weakness in contemporary students, while good reporting and writing skills are highly valued by graduate employers. A number of universities have introduced writing centres aimed at addressing this problem; however, the evaluation of such centres is usually qualitative. The paper seeks to consider the efficacy of a writing centre by looking at the impact of attendance on two "real world" quantitative outcomes – achievement and progression. Design/methodology/approach – Data mining was used to obtain records of 806 first-year students, of whom 45 had attended the writing centre and 761 had not. Findings – A highly significant association between writing centre attendance and achievement was found. Progression to year two was also significantly associated with writing centre attendance. Originality/value – Further quantitative evaluation of writing centres is advocated, using random allocation to a comparison condition to control for potential confounds such as motivation.
Abstract:
National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country for 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local short-term, bespoke, weather predictions and warnings. This thesis shows that these less-demanding requirements do not require exceptional computing power and can be met by a modern, desk-top system which monitors site-specific ground conditions (such as temperature, pressure, wind speed and direction, etc) augmented with above-ground information from satellite images to produce `nowcasts'. The emphasis in this thesis has been towards the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809 based sub-system. Above-ground information is received from the METEOSAT 4 geo-stationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real-time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. Ground data produces a time series of measurements at a specific location which represents the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images.
The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an `area of interest'. By this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique for its implementation in automatically classifying a cloud feature over the `area of interest' for nowcasting using the multi-dimensional signals.
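The ARIMA-style modelling of the ground-sensor time series can be illustrated with a stripped-down version: the sketch below fits a pure autoregressive AR(p) model by least squares and produces a one-step-ahead "nowcast". This is a simplification of full ARIMA (no differencing or moving-average terms), and the synthetic series stands in for real sensor readings.

```python
import numpy as np

def fit_ar(series, p):
    """Fit an AR(p) model y_t = c + sum_i a_i * y_{t-i} by least squares."""
    y = np.asarray(series, dtype=float)
    # Row t of X holds [1, y_{t-1}, ..., y_{t-p}] for targets y_p .. y_{T-1}.
    X = np.column_stack([y[p - i - 1 : len(y) - i - 1] for i in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef  # [c, a_1, ..., a_p]

def forecast_one(series, coef):
    """One-step-ahead prediction from the most recent p observations."""
    p = len(coef) - 1
    recent = np.asarray(series[-p:], dtype=float)[::-1]  # y_{t}, y_{t-1}, ...
    return coef[0] + coef[1:] @ recent

# Synthetic "temperature" series following y_t = 0.8 * y_{t-1} + noise.
rng = np.random.default_rng(1)
y = [0.0]
for _ in range(500):
    y.append(0.8 * y[-1] + rng.normal(scale=0.1))

coef = fit_ar(y, p=1)
nowcast = forecast_one(y, coef)
```

In a deployed system the fitted coefficients would be refreshed as new ground measurements arrive, which keeps the per-step cost small enough for the modest PC hardware the thesis targets.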
Abstract:
This article considers two contrasting approaches to reforming public services in order to meet the needs of people living in poverty. The first approach is top-down, involves categorising individuals (as 'hard to help', 'at risk', etc) and invokes scientific backing for justification. The second approach is bottom-up, emancipatory, relates to people as individuals and treats people who have experience of poverty and social exclusion as experts. The article examines each approach through providing brief examples in the fields of unemployment and parenting policy - two fields that have been central to theories of 'cycles of deprivation'. It is suggested here that the two approaches differ in terms of their scale, type of user involvement and type of evidence that is used for their legitimation. While the article suggests that direct comparison between the two approaches is difficult, it highlights the prevalence of top-down approaches towards services for people living in poverty, despite increasing support for bottom-up approaches in other policy areas.
Abstract:
Menorrhagia, or heavy menstrual bleeding (HMB), is a common gynaecological condition. As the aim of treatment is to improve women's wellbeing and quality of life (QoL), it is necessary to have effective ways to measure this. This study investigated the reliability and validity of the menorrhagia multi-attribute scale (MMAS), a menorrhagia-specific QoL instrument. Participants (n = 431) completed the MMAS and a battery of other tests as part of the baseline assessment of the ECLIPSE (Effectiveness and Cost-effectiveness of Levonorgestrel-containing Intrauterine system in Primary care against Standard trEatment for menorrhagia) trial. Analyses of their responses suggest that the MMAS has good measurement properties and is therefore an appropriate condition-specific instrument to measure the outcome of treatment for HMB. © 2011 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2011 RCOG.
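Reliability analyses like the one reported for the MMAS typically include an internal-consistency statistic. The sketch below computes Cronbach's alpha with NumPy; the response matrix is made up for illustration — the actual MMAS items and ECLIPSE data are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses: 6 respondents answering 3 related items.
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 3, 2],
    [4, 4, 5],
    [1, 2, 1],
])
alpha = cronbach_alpha(scores)
```

With highly correlated items like these, alpha comes out close to 1, which is the pattern a "good measurement properties" conclusion rests on.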
Abstract:
In a Data Envelopment Analysis model, some of the weights used to compute the efficiency of a unit can have zero or negligible value despite the importance of the corresponding input or output. This paper offers an approach to preventing inputs and outputs from being ignored in the DEA assessment under the multiple input and output VRS environment, building on an approach introduced in Allen and Thanassoulis (2004) for single input multiple output CRS cases. The proposed method is based on the idea of introducing unobserved DMUs created by adjusting input and output levels of certain observed relatively efficient DMUs, in a manner which reflects a combination of technical information and the decision maker's value judgements. In contrast to many alternative techniques used to constrain weights and/or improve envelopment in DEA, this approach allows one to impose local information on production trade-offs, which are in line with the general VRS technology. The suggested procedure is illustrated using real data. © 2011 Elsevier B.V. All rights reserved.
Abstract:
In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically, and to monitor them in near-real-time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors. It demonstrates the ability of the proposed algorithm to perform as effectively as human interpretation of the images. The validation of the permanent water surface product with an independent dataset derived from high resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method have been identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability, and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without too many difficulties.
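The core of the RGB-to-HSV trick is easy to demonstrate with Python's standard colorsys module: after conversion, water pixels tend to cluster in a bluish hue band with non-negligible saturation, which is far more stable than their raw RGB values. The hue and saturation thresholds below are illustrative assumptions, not the paper's calibrated values.

```python
import colorsys

def is_water_candidate(r, g, b, hue_range=(0.5, 0.75), min_sat=0.2):
    """Classify one RGB pixel (channels in [0, 1]) as potential water.

    colorsys returns hue in [0, 1]; the cyan-to-blue band used here
    (0.5 - 0.75) and the saturation floor are assumed thresholds.
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return hue_range[0] <= h <= hue_range[1] and s >= min_sat

lake_pixel = is_water_candidate(0.10, 0.30, 0.60)   # dark blue -> water
field_pixel = is_water_candidate(0.20, 0.55, 0.15)  # green vegetation -> not
```

Working in hue/saturation space is what lets a single rule generalise across illumination changes; an operational system would apply it per pixel across multi-temporal imagery rather than to single values.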
Abstract:
GraphChi is the first reported disk-based graph engine that can handle billion-scale graphs on a single PC efficiently. GraphChi is able to execute several advanced data mining, graph mining and machine learning algorithms on very large graphs. With the novel technique of parallel sliding windows (PSW) to load subgraphs from disk to memory for vertex and edge updating, it can achieve data processing performance close to, and even better than, that of mainstream distributed graph engines. The GraphChi authors noted that its memory is not effectively utilized with large datasets, which leads to suboptimal computation performance. In this paper we are motivated by the concepts of 'pin' from TurboGraph and 'ghost' from GraphLab to propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve the GraphChi algorithm performance. The main idea is to pin a fixed part of the data inside the memory during the whole computing process. Part-in-memory mode is successfully implemented with only about 40 additional lines of code to the original GraphChi engine. Extensive experiments are performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory memory management approach effectively reduces the GraphChi running time by up to 60% in the PageRank algorithm. Interestingly, it is found that a larger portion of data pinned in memory does not always lead to better performance when the whole dataset cannot fit in memory. There exists an optimal portion of data which should be kept in the memory to achieve the best computational performance.
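The Part-in-memory idea — keeping a fixed portion of the graph data resident in RAM while streaming the rest from disk each iteration — can be sketched abstractly. The toy shard store below pins the first fraction of shards and counts simulated disk loads for the rest; the class and names are illustrative, not GraphChi's actual code.

```python
class PartInMemoryStore:
    """Toy model of Part-in-memory shard access: shards below the pin
    boundary stay resident; the rest are (re)loaded on every access."""

    def __init__(self, shards, pin_fraction):
        n_pinned = int(len(shards) * pin_fraction)
        items = list(shards.items())
        self.pinned = dict(items[:n_pinned])    # resident in RAM for the run
        self.on_disk = dict(items[n_pinned:])   # must be streamed each time
        self.disk_loads = 0                     # counts simulated disk I/O

    def get(self, shard_id):
        if shard_id in self.pinned:
            return self.pinned[shard_id]        # free: already in memory
        self.disk_loads += 1                    # simulated disk read
        return self.on_disk[shard_id]

shards = {i: f"edges-{i}" for i in range(10)}
store = PartInMemoryStore(shards, pin_fraction=0.4)  # pin shards 0-3

for i in range(10):  # one full "iteration" over all shards
    store.get(i)
```

After one sweep, only the 6 unpinned shards incur disk loads. The abstract's observation that a larger pinned portion is not always better corresponds to the point where pinning starts to crowd out the working memory PSW needs for the shards being streamed.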