821 results for Quality evaluation
Abstract:
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the Weather Research and Forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer; (2) coupling to fine-scale computational fluid dynamics Reynolds-averaged Navier–Stokes and large-eddy simulation models for transport and dispersion (T&D) applications; (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT); and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute the WRF/urban model; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios. Copyright © 2010 Royal Meteorological Society
Abstract:
The quality control, validation and verification of the European Flood Alert System (EFAS) are described. EFAS is designed as a flood early warning system at pan-European scale, to complement national systems and provide flood warnings more than 2 days before a flood. On average 20–30 alerts per year are sent out to the EFAS partner network which consists of 24 National hydrological authorities responsible for transnational river basins. Quality control of the system includes the evaluation of the hits, misses and false alarms, showing that EFAS has more than 50% of the time hits. Furthermore, the skills of both the meteorological as well as the hydrological forecasts are evaluated, and are included here for a 10-year period. Next, end-user needs and feedback are systematically analysed. Suggested improvements, such as real-time river discharge updating, are currently implemented.
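The hit/miss/false-alarm evaluation described above follows the standard 2×2 contingency-table approach. The following is a minimal Python sketch of the corresponding scores, using hypothetical counts for illustration only; it is not EFAS code or EFAS statistics.

```python
# Minimal sketch (not the EFAS implementation): standard contingency-table
# scores used to summarise hits, misses and false alarms in a warning system.
from dataclasses import dataclass

@dataclass
class Contingency:
    hits: int          # alert issued and flood observed
    misses: int        # no alert but flood observed
    false_alarms: int  # alert issued but no flood observed

    def hit_rate(self) -> float:
        """Fraction of observed floods that were preceded by an alert."""
        return self.hits / (self.hits + self.misses)

    def false_alarm_ratio(self) -> float:
        """Fraction of alerts that were not followed by a flood."""
        return self.false_alarms / (self.hits + self.false_alarms)

# Hypothetical counts for illustration only (not EFAS results).
table = Contingency(hits=15, misses=10, false_alarms=8)
print(f"hit rate: {table.hit_rate():.2f}, "
      f"false alarm ratio: {table.false_alarm_ratio():.2f}")
```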
Abstract:
Medium-range flood forecasting activities, driven by various meteorological forecasts ranging from high-resolution deterministic forecasts to low-spatial-resolution ensemble prediction systems, share a major challenge in the appropriateness and design of performance measures. In this paper, possible limitations of some traditional hydrological and meteorological prediction quality and verification measures are identified. Some simple modifications are applied in order to circumvent the problem of autocorrelation dominating river discharge time series and to create a benchmark model that enables decision makers to evaluate the forecast quality and the model quality. Although the performance period is quite short, the advantage of a simple cost-loss function as a measure of forecast quality can be demonstrated.
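A simple cost-loss evaluation of the kind mentioned above can be sketched as follows. This is a generic illustration with hypothetical counts and costs, not the paper's exact measure or its benchmark model: a user pays a protection cost when a flood is forecast and suffers a larger loss for each unprotected flood, and the resulting expense is compared against a naive benchmark.

```python
# Minimal sketch of a simple cost-loss evaluation (assumed, not the paper's measure).
# Protecting costs C per occasion; an unprotected flood costs L (> C). The forecast's
# expense is compared with the cheaper of "always protect" and "never protect".
def expense(hits: int, misses: int, false_alarms: int, events: int,
            n: int, cost: float, loss: float) -> dict:
    forecast = (hits + false_alarms) * cost + misses * loss
    always_protect = n * cost
    never_protect = events * loss
    return {"forecast": forecast, "benchmark": min(always_protect, never_protect)}

# Hypothetical numbers for illustration: 100 forecast days, 20 flood events.
e = expense(hits=15, misses=5, false_alarms=10, events=20, n=100, cost=1.0, loss=10.0)
print(f"forecast expense: {e['forecast']:.0f}, benchmark expense: {e['benchmark']:.0f}")
```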
Abstract:
High-resolution surface wind fields covering the global ocean, estimated from remotely sensed wind data and ECMWF wind analyses, have been available since 2005 with a spatial resolution of 0.25 degrees in longitude and latitude and a temporal resolution of 6 h. Their quality is investigated through various comparisons with surface wind vectors from 190 buoys moored in various oceanic basins, from research vessels and from QuikSCAT scatterometer data taken during 2005–2006. The NCEP/NCAR and NCDC blended wind products are also considered. The comparisons performed during January–December 2005 show that speeds and directions compare well with in-situ observations, including those from moored buoys and ships, as well as with the remotely sensed data. The root-mean-squared differences of wind speed and direction for the new blended wind data are lower than 2 m/s and 30 degrees, respectively. These values are similar to those estimated in comparisons of hourly buoy measurements and QuikSCAT near-real-time retrievals. At the global scale, the new products compare well with the wind speed and wind vector components observed by QuikSCAT. No significant dependencies on the QuikSCAT wind speed or on the oceanic region considered are evident.
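The root-mean-squared differences quoted above can be computed as in the minimal sketch below. The collocated samples are hypothetical, and wrapping direction differences to the shortest angle is an assumption the abstract does not spell out.

```python
# Minimal sketch (assumed, not the authors' code): RMS differences between a
# blended wind product and buoy observations, with wind-direction differences
# wrapped into [-180, 180] degrees before averaging.
import numpy as np

def rmsd(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.sqrt(np.mean((a - b) ** 2)))

def direction_rmsd(dir_a_deg: np.ndarray, dir_b_deg: np.ndarray) -> float:
    diff = (dir_a_deg - dir_b_deg + 180.0) % 360.0 - 180.0  # shortest angular difference
    return float(np.sqrt(np.mean(diff ** 2)))

# Hypothetical collocated samples (blended product vs. buoy) for illustration.
speed_product = np.array([6.1, 8.4, 11.0, 4.2])
speed_buoy    = np.array([5.8, 9.0, 10.1, 4.9])
dir_product   = np.array([350.0, 20.0, 180.0, 90.0])
dir_buoy      = np.array([10.0, 15.0, 175.0, 110.0])

print(f"speed RMSD:     {rmsd(speed_product, speed_buoy):.2f} m/s")
print(f"direction RMSD: {direction_rmsd(dir_product, dir_buoy):.1f} deg")
```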
Abstract:
This study examines if and how gender relates to research evaluation via panel assessment and journal ratings lists. Using data from UK business schools, we find no evidence that the proportion of women in a submission for panel assessment affected the score received by the submitting institution. However, we do find that women on average receive lower scores according to some journal ratings lists. There are important differences in the rated quality of the journals that men and women publish in across the sub-disciplines, with men publishing significantly more research in the highest-rated accountancy, information management and strategy journals. In addition, women who are able to utilise networks to co-author with individuals outside their institution are able to publish in higher-rated journals, although the same is not true for men. Women who are attributed with “individual staff circumstances” (e.g. maternity leave or part-time working) have lower scores according to journal ratings lists.
Abstract:
Assessments concerning the effects of climate change, water resource availability and water deprivation in West Africa have not frequently considered the positive contribution to be derived from collecting and reusing water for domestic purposes. Where the originating water is taken from a clean water source and has first been used for washing or bathing, this water is commonly called “greywater”. Greywater is a prolific resource that is generated wherever people live. Treated greywater can be used for domestic cleaning, for flushing toilets where appropriate, for washing cars, sometimes for watering kitchen gardens, and for clothes washing prior to rinsing. Therefore, a large theoretical potential exists to increase total water resource availability if greywater were to be widely reused. Locally treated greywater reduces the distribution network requirement, lowers construction effort and cost and, wherever possible, minimises the associated carbon footprint. Such locally treated greywater offers significant practical opportunities for increasing the total available water resources at a local level. The reuse of treated greywater is one important action that will help to mitigate the declining availability of clean water supplies in some areas, and the mitigation expected to be required in future aligns well with WHO/UNICEF (2012) aspirations. The evaluation of potential opportunities for prioritising greywater systems to support water reuse takes into account the availability of water resources, water use indicators and published estimates in order to understand typical patterns of water demand. The approach supports the acquisition of knowledge about local conditions for enabling capacity building for greywater reuse, the understanding of the systems that are most likely to encourage greywater reuse, and the practices and future actions needed to stimulate greywater infrastructure planning, design and implementation. Although reuse might be considered to increase the uncertainty of achieving a specified quality of the water supply, robust methods and technologies are available for local treatment. Resource strategies for greywater reuse have the potential to consistently improve water efficiency and availability in water-impoverished and water-stressed regions of Ghana and West Africa. Untreated greywater is referred to as “greywater” and treated greywater as “treated greywater” in this paper.
Abstract:
Quantitative palaeoclimate reconstructions are widely used to evaluate climate model performance. Here, as part of an effort to provide such a data set for Australia, we examine the impact of analytical decisions and sampling assumptions on modern-analogue reconstructions using a continent-wide pollen data set. There is a high degree of correlation between temperature variables in the modern climate of Australia, but there is sufficient orthogonality in the variations of precipitation, summer and winter temperature and plant-available moisture to allow independent reconstructions of these four variables to be made. The method of analogue selection does not affect the reconstructions, although bootstrap resampling provides a more reliable technique for obtaining robust measures of uncertainty. The number of analogues used affects the quality of the reconstructions: the most robust reconstructions are obtained using 5 analogues. The quality of reconstructions based on post-1850 CE pollen samples differs little from those using samples from between 1450 and 1849 CE, showing that post-European-settlement modification of vegetation has no impact on the fidelity of the reconstructions, although it substantially increases the availability of potential analogues. Reconstructions based on core-top samples are more realistic than those using surface samples, but using only core-top samples would substantially reduce the number of available analogues and therefore increase the uncertainty of the reconstructions. Spatial and/or temporal averaging of pollen assemblages prior to analysis negatively affects the subsequent reconstructions for some variables and increases the associated uncertainties. In addition, the quality of the reconstructions is affected by the degree of spatial smoothing of the original climate data, with the best reconstructions obtained using climate data from a 0.5° resolution grid, which corresponds to the typical size of the pollen catchment. This study provides a methodology that can be used to produce reliable palaeoclimate reconstructions for Australia, which will fill a major gap in the data sets used to evaluate climate models.
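The modern-analogue procedure with bootstrap resampling and 5 analogues can be sketched as below. This is an assumed illustration on toy data, not the authors' code; the squared-chord distance is a common choice for pollen spectra but is an assumption here.

```python
# Minimal sketch of a modern-analogue reconstruction (assumed, not the authors'
# exact procedure): fossil pollen spectra are compared with modern samples using
# the squared-chord distance, the climate of the k closest analogues is averaged,
# and a bootstrap over the modern data set gives a rough uncertainty estimate.
import numpy as np

def squared_chord(fossil: np.ndarray, modern: np.ndarray) -> np.ndarray:
    """Distance between one fossil spectrum and each modern spectrum (proportions)."""
    return np.sum((np.sqrt(fossil) - np.sqrt(modern)) ** 2, axis=1)

def analogue_reconstruction(fossil, modern_pollen, modern_climate, k=5, n_boot=200, rng=None):
    rng = rng or np.random.default_rng(0)
    estimates = []
    n = len(modern_pollen)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # bootstrap resample of modern sites
        d = squared_chord(fossil, modern_pollen[idx])
        nearest = np.argsort(d)[:k]              # k closest analogues
        estimates.append(modern_climate[idx][nearest].mean())
    estimates = np.array(estimates)
    return estimates.mean(), estimates.std()

# Hypothetical toy data: 3 pollen taxa, 50 modern sites, one climate variable.
rng = np.random.default_rng(1)
modern_pollen = rng.dirichlet(np.ones(3), size=50)
modern_climate = rng.uniform(5.0, 25.0, size=50)  # e.g. summer temperature (degC)
fossil_sample = rng.dirichlet(np.ones(3))

mean, sd = analogue_reconstruction(fossil_sample, modern_pollen, modern_climate, k=5)
print(f"reconstructed value: {mean:.1f} ± {sd:.1f}")
```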
Abstract:
This paper aims to assess the performance of a thermal simulation program (Arquitrop) in different households in the city of São Paulo, Brazil. The households were selected from the Wheezing Project, which followed up children under 2 years old to monitor the occurrence of respiratory diseases. The results show that in all three study households there is good agreement between the observed and the simulated indoor temperatures. A fairly consistent and realistic relationship was also observed between the simulated indoor and the outdoor temperatures, indicating that the Arquitrop model is an efficient estimator and a good representation of the thermal behavior of households in the city of São Paulo. The worst simulation is linked to the poorest type of construction; this may be explained by the poor quality of the construction, which Arquitrop could not simulate adequately.
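A simple way to quantify the agreement between observed and simulated indoor temperatures is sketched below. The paper reports the comparison qualitatively, so the metrics and numbers here are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the study's analysis): basic agreement statistics
# between observed and simulated indoor temperatures.
import numpy as np

observed  = np.array([24.1, 25.3, 27.0, 26.2, 23.8])  # hypothetical indoor temps (degC)
simulated = np.array([23.7, 25.9, 26.4, 26.8, 24.3])  # hypothetical model output (degC)

bias = float(np.mean(simulated - observed))
rmse = float(np.sqrt(np.mean((simulated - observed) ** 2)))
corr = float(np.corrcoef(observed, simulated)[0, 1])
print(f"bias: {bias:+.2f} degC, RMSE: {rmse:.2f} degC, r: {corr:.2f}")
```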
Abstract:
Kidney transplantation improves the quality of life of end-stage renal disease patients. The quality-of-life benefits, however, pertain to patients on average, not to all transplant recipients. The aim of this study was to identify factors associated with health-related quality of life after kidney transplantation. A population-based study with a cross-sectional design was carried out, and quality of life was assessed with the SF-36 Health Survey Version 1. A multivariate linear regression model was constructed with sociodemographic, clinical and laboratory data as independent variables. Two hundred and seventy-two kidney recipients with a functioning graft were analyzed. Hypertension, diabetes, higher serum creatinine and lower hematocrit were independently and significantly associated with lower scores for the SF-36 oblique physical component summary (PCSc). The final regression model explained 11% of the PCSc variance. The scores of the oblique mental component summary (MCSc) were worse for females, patients with a lower income, the unemployed and patients with a higher serum creatinine. The regression model explained 9% of the MCSc variance. Among the studied variables, comorbidity and graft function were the main factors associated with the PCSc, and sociodemographic variables and graft function were the main determinants of the MCSc. Although comprehensive, the final regression models explained only a small part of the health-related quality-of-life variance. Additional factors, such as personal, environmental and clinical ones, might influence the quality of life perceived by patients after kidney transplantation.
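The multivariate linear regression and the reported share of explained variance can be illustrated with a minimal sketch. The covariates, data and coefficients below are hypothetical and only mirror the kind of model described; this is not the study's analysis.

```python
# Minimal sketch (assumed): ordinary least squares of an SF-36 summary score on
# clinical/laboratory covariates, reporting the explained variance (R^2).
import numpy as np

rng = np.random.default_rng(0)
n = 272                                  # number of recipients in the study
# Hypothetical covariates: hypertension (0/1), diabetes (0/1), creatinine, hematocrit
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.normal(1.5, 0.5, n),
    rng.normal(38.0, 5.0, n),
])
y = 50.0 - 3.0 * X[:, 0] - 4.0 * X[:, 1] - 2.0 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 8.0, n)

A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(f"coefficients: {np.round(coef, 2)}")
print(f"R^2 (share of score variance explained): {r2:.2f}")
```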
Abstract:
Quantitative reverse-transcription polymerase chain reaction (qRT-PCR) is a standard assay in molecular medicine for gene expression analysis. Samples from incisional/needle biopsies, laser-microdissected tumor cells and other biologic sources, normally available in clinical cancer studies, generate very small amounts of RNA that are restrictive for expression analysis. As a consequence, an RNA amplification procedure is required to assess the gene expression levels of such sample types. The reproducibility and accuracy of relative gene expression data produced by a sensitive methodology such as qRT-PCR, when cDNA converted from amplified (A) RNA is used as template, has not yet been properly addressed. In this study, to properly evaluate this issue, we performed 1 round of linear RNA amplification in 2 breast cell lines (C5.2 and HB4a) and assessed the relative expression of 34 genes using cDNA converted from both nonamplified (NA) and A RNA. Relative gene expression was obtained from beta-actin- or glyceraldehyde 3-phosphate dehydrogenase-normalized data using different dilutions of cDNA, and the variability and fold-change differences in expression between the 2 methods were compared. Our data show that 1 round of linear RNA amplification, even with suboptimal-quality RNA, is appropriate to generate reproducible and high-fidelity qRT-PCR relative expression data with confidence levels similar to those from NA samples. The use of cDNA converted from both A and NA RNA in a single qRT-PCR experiment, however, clearly creates bias in relative gene expression data.
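Relative expression from reference-gene-normalized qRT-PCR data is commonly computed with the 2^-ΔΔCt approach. The sketch below is an assumed illustration with hypothetical Ct values; the abstract does not state which relative-quantification formula was used.

```python
# Minimal sketch (assumed, not necessarily the authors' normalisation): relative
# expression from qRT-PCR Ct values using the common 2^-ddCt approach, normalised
# to a reference gene such as beta-actin or GAPDH, comparing fold changes obtained
# from amplified (A) and non-amplified (NA) RNA.
def fold_change(ct_target_sample, ct_ref_sample, ct_target_calibrator, ct_ref_calibrator):
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalise to reference gene
    d_ct_calib = ct_target_calibrator - ct_ref_calibrator
    dd_ct = d_ct_sample - d_ct_calib
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values: a target gene in C5.2 vs HB4a, beta-actin as reference.
fc_na = fold_change(24.8, 18.2, 26.9, 18.4)  # cDNA from non-amplified RNA
fc_a  = fold_change(22.1, 15.6, 24.3, 15.9)  # cDNA from amplified RNA
print(f"fold change (NA): {fc_na:.2f}, fold change (A): {fc_a:.2f}")
```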
Abstract:
The Toledo River region, Paraná, Brazil, is characterized by intense anthropogenic activities. Hence, metal concentrations and physical-chemical parameters of Toledo River water were determined in order to complete an environmental evaluation catalog. Samples were collected monthly over a one-year period at seven different sites from the source down to the river mouth, physical-chemical variables were analyzed, and major metallic ions were measured. Metal analysis was performed using the synchrotron radiation total reflection X-ray fluorescence technique. A statistical analysis was applied to evaluate the reliability of the experimental data. The analysis of the results showed that a strong correlation between physical-chemical parameters existed across sites 1 to 7, suggesting that organic pollutants were mainly responsible for the decrease in Toledo River water quality.
Abstract:
The burning of organic residues and wastes in furnaces of cement industries has been an attractive and lucrative approach to eliminating stocks of these pollutants. There is a potential risk of producing PAHs in the workplaces of industries burning organic wastes, so highly sensitive analytical methods are needed for monitoring the air quality of these environments. An official method for the determination of PAHs is based on liquid chromatography with fluorescence detection at fixed excitation and emission wavelengths. We demonstrate that a suitable choice of these wavelengths, changed during the chromatographic run, significantly improves the detectability of PAHs in the atmosphere and in particulate matter collected in cement industries.
Abstract:
If a plastic material is used as a print bearer, a special surface treatment is needed to obtain good and durable printing. The most widely used surface treatment technique at the moment is corona treatment. Unfortunately, this kind of treatment has been shown not to be very durable in the long term. Plasma treatment, which in this case uses different kinds of gases in the treatment of polypropylene, is shown in this project to be a more effective treatment. When the plasma-treated surface has been printed, the good print quality lasts much longer and the adhesion between the ink and the surface is retained. To test this adhesion, a standard (ASTM D3359) is currently used. This standard has proved unstable and dependent on many different factors, which gives a large variation in the test results. Because of this, new test methods have been developed to give a more consistent and more reliable result when testing the adhesion.
Abstract:
Internet Protocol TV (IPTV) is predicted to be a key technology winner in the future. Efforts to accelerate IPTV deployment build on a centralized model that combines the VHO, encoders, a controller, the access network and the home network. Regardless of whether the network is delivering live TV, VOD or time-shifted TV, all content and network traffic resulting from subscriber requests must traverse the entire network from the super-headend all the way to each subscriber's set-top box (STB). IPTV services require very stringent QoS guarantees; when IPTV traffic shares the network resources with other traffic such as data and voice, how to ensure its QoS and efficiently utilize the network resources is a key and challenging issue. QoS is measured in the network-centric terms of delay jitter, packet loss and bounds on delay. The main focus of this thesis is on optimized bandwidth allocation and smooth data transmission. A traffic model is proposed for smoothly delivering video services over an IPTV network, together with its QoS performance evaluation. Following Maglaris et al. [5], the coding bit rate of a single video source is first analyzed. Various statistical quantities are derived from bit-rate data collected with a conditional replenishment inter-frame coding scheme. Two correlated Markov process models (one in discrete time and one in continuous time) are shown to fit the experimental data and are used to model the input rates of several independent sources into a statistical multiplexer. A preventive control mechanism, which includes CAC and traffic policing, is used for traffic control. The QoS of a common bandwidth scheduler (FIFO) is evaluated using fluid models with a Markovian queuing method, and the results are analyzed both by simulation and analytically, measuring packet loss, overflow and mean waiting time among the network users.
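The on/off Markov source model and FIFO evaluation described above can be illustrated with a small discrete-time simulation. The structure and all parameters below are assumptions for illustration, not the thesis implementation; mean buffer occupancy is reported as a stand-in for mean waiting time (related via Little's law).

```python
# Minimal sketch (assumed): several independent on/off Markov video sources
# multiplexed into a finite FIFO buffer served at constant rate, estimating
# packet loss, overflow frequency and mean buffer occupancy.
import random

def simulate(n_sources=10, p_on=0.1, p_off=0.3, peak_rate=1.0,
             service_rate=3.0, buffer_size=20.0, steps=100_000, seed=42):
    rng = random.Random(seed)
    on = [False] * n_sources        # each source toggles between OFF and ON
    queue = 0.0
    arrived = lost = 0.0
    overflow_steps = 0
    occupancy_sum = 0.0
    for _ in range(steps):
        for i in range(n_sources):  # update each two-state Markov source
            if on[i]:
                if rng.random() < p_off:
                    on[i] = False
            elif rng.random() < p_on:
                on[i] = True
        arrivals = peak_rate * sum(on)        # fluid arrivals this slot
        arrived += arrivals
        queue += arrivals
        if queue > buffer_size:               # overflow: excess fluid is lost
            lost += queue - buffer_size
            queue = buffer_size
            overflow_steps += 1
        queue = max(0.0, queue - service_rate)  # FIFO service at constant rate
        occupancy_sum += queue
    return {
        "loss_fraction": lost / arrived,
        "overflow_fraction": overflow_steps / steps,
        "mean_occupancy": occupancy_sum / steps,
    }

print(simulate())
```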
Abstract:
In this project, two broad facets of the design of a methodology for performance optimization of indexable carbide inserts were examined: physical destructive testing and software simulation. For the physical testing, statistical research techniques were used in the design of the methodology. A five-step method, which began with problem definition and proceeded through system identification, statistical model formation, data collection, and statistical analysis and results, was elaborated upon in depth. The set-up and execution of an experiment with a compression machine were examined, together with roadblocks to quality data collection and possible solutions to curb them. The 2^k factorial design was illustrated and recommended for process improvement. Instances of first-order and second-order response surface analyses were encountered. In the case of curvature, a test for curvature significance with center-point analysis was recommended. Process optimization with the method of steepest ascent and a central composite design, or process robustness studies based on response surface analyses, were also recommended. For the simulation test, the AdvantEdge program was identified as the most widely used software for tool development. Challenges to the efficient application of this software were identified and possible solutions proposed. In conclusion, both software simulation and physical testing were recommended to meet the objective of the project.
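A 2^k full factorial design, a first-order response surface fit and the direction of steepest ascent can be sketched as follows. The factors, response and numbers are hypothetical; this is a generic illustration of the techniques named above, not the project's code.

```python
# Minimal sketch (assumed): generate a 2^k full factorial design in coded units,
# fit a first-order model by least squares, and take the fitted coefficients as
# the direction of steepest ascent.
from itertools import product
import numpy as np

def full_factorial(k: int) -> np.ndarray:
    """All 2^k runs with factors coded as -1 / +1."""
    return np.array(list(product([-1.0, 1.0], repeat=k)))

# Hypothetical 2^3 experiment: three coded factors (e.g. speed, feed, depth of cut)
# and a measured response such as insert performance.
X = full_factorial(3)
rng = np.random.default_rng(0)
y = 10.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.2, len(X))

# First-order response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b = coef[0], coef[1:]

# Path of steepest ascent: move proportionally to the first-order coefficients.
step = b / np.linalg.norm(b)
print(f"intercept: {b0:.2f}, effects: {np.round(b, 2)}")
print(f"steepest-ascent direction (coded units): {np.round(step, 2)}")
```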