911 results for Early Warning and Nowcasting Approaches for Water Quality in Riverine and Coastal Systems
Abstract:
The performance, carcass traits and finishing costs of Suffolk lambs were evaluated in three systems: (1) lambs weaned at 22 kg of body weight (BW) and supplemented with concentrate on pasture until slaughter; (2) lambs weaned at 22 kg BW and fed in feedlot until slaughter; (3) lambs kept under controlled nursing after reaching 22 kg BW and creep fed in feedlot until slaughter. Average daily gain (ADG) was 224 g/d for lambs weaned and supplemented with concentrate on pasture, 386 g/d for lambs weaned in feedlot and 481 g/d for lambs under controlled nursing. Empty body weight and visceral fat deposition were highest in lambs from the feedlot systems. Carcass weights and carcass yields were highest for lambs under controlled nursing. Total finishing costs were highest under controlled nursing and lowest in the system with weaning in feedlot. A high-concentrate diet combined with controlled nursing in feedlot allowed lambs to reach their growth potential and produce carcasses with higher weights, higher yields and higher fat content. After weaning, lambs fed a high-concentrate diet in feedlot had higher weight gain than lambs supplemented with concentrate on pasture, and carcasses produced under these two systems presented the same characteristics. The system with weaning in feedlot showed the lowest cost per kg of carcass.
Abstract:
Websites are nowadays the face of institutions, yet they are often neglected, especially when it comes to content. In this paper, we present a research project whose final goal is the development of a model for measuring data quality in the institutional websites of health units. To that end, we have carried out a bibliographic review of the available approaches for evaluating website content quality, in order to identify the most recurrent dimensions and attributes, and we are currently conducting a Delphi method process, presently in its second stage, with the purpose of reaching an adequate set of attributes for measuring content quality.
Abstract:
This article presents a research work whose goal was to achieve a model for the broad and balanced evaluation of data quality in the institutional websites of health units. We carried out a literature review of the available approaches for evaluating website content quality, in order to identify the most recurrent dimensions and attributes, and we also conducted a Delphi method process with experts in order to reach an adequate set of attributes, and their respective weights, for measuring content quality. The results revealed a high level of consensus among the experts who participated in the Delphi process. Moreover, the various statistical analyses and techniques implemented are robust and lend confidence to our results and to the resulting model.
Abstract:
A Partisol-Plus sequential air sampler was placed in a small village (Foros de Arrão) in central Portugal to collect PM10 (particles with an aerodynamic diameter below 10 μm) during the winter period, over 3 months (December 2009–March 2010). Particle masses were determined gravimetrically and the filters were analyzed by instrumental neutron activation analysis to assess their chemical composition. The water-soluble ion composition of the collected particles was determined by ion-exchange chromatography. Principal component analysis was applied to the data set of chemical elements and soluble ions to identify the main sources of the air pollutants. The use of both analytical techniques provided information about elemental solubility, such as for potassium, which was important to differentiate sources.
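The source-apportionment step described above rests on principal component analysis of the species-by-sample matrix. A minimal sketch of that computation, using invented data rather than the study's measurements, could look like this (the species names and matrix values are purely illustrative):

```python
# Minimal PCA sketch for source apportionment of PM10 composition data.
# The sample matrix below is random placeholder data, not the study's.
import numpy as np

rng = np.random.default_rng(0)
# rows = daily filter samples, columns = measured species (e.g. K, Na, Cl, Fe)
X = rng.normal(size=(30, 4))

# Standardise each species, then diagonalise the covariance matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]        # largest variance first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()      # fraction of variance per component
scores = Z @ eigvecs                     # sample scores on each component
```

Components with high loadings on crustal elements versus combustion tracers (such as soluble potassium) would then be interpreted as distinct emission sources.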
Abstract:
OBJECTIVE: To examine the relationship between growth patterns in early childhood and the onset of menarche before age 12. METHODS: The study included 2,083 women from a birth cohort study conducted in the city of Pelotas, Southern Brazil, starting in 1982. Anthropometric, behavioral, and pregnancy-related variables were collected through home interviews. Statistical analyses were performed using Pearson's chi-square test and the chi-square test for linear trend. A multivariable analysis was carried out using Poisson regression based on a hierarchical model. RESULTS: Mean age at menarche was 12.4 years and the prevalence of menarche before age 12 was 24.3%. Higher weight-for-age, height-for-age, and weight-for-height z-scores at 19.4 and 43.1 months of age were associated with linear trends of increased prevalence and relative risks of the onset of menarche before age 12. Girls who experienced rapid growth in weight-for-age z-score from birth to 19.4 months of age, or in weight-for-age or height-for-age z-scores from 19.4 to 43.1 months of age, also showed higher risk of menarche before age 12. Higher risk was observed when rapid growth in weight-for-age z-score occurred during both age intervals, and the highest risk was found among those in the first tertile of Williams' curve at birth. Rapid growth in weight-for-height z-score was not associated with menarche before age 12. CONCLUSIONS: Menarche is affected by nutritional status and growth patterns during early childhood. Preventing overweight and obesity during early childhood and keeping a "normal" growth pattern seem crucial for the prevention of health conditions during adulthood.
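The relative risks reported above come from Poisson regression, but the quantity being estimated is conceptually simple: the ratio of the outcome's prevalence between exposure groups. A toy calculation with invented counts (not the cohort's data) illustrates it:

```python
# Illustrative relative-risk calculation of the kind the abstract's
# Poisson-regression estimates approximate. All counts are hypothetical.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Ratio of early-menarche risk between two growth-pattern groups."""
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# e.g. rapid-growth group vs. reference group (hypothetical numbers):
rr = relative_risk(90, 300, 150, 1000)   # 0.30 / 0.15 = 2.0
```

In the actual analysis, Poisson regression with a hierarchical model lets this ratio be adjusted for the confounders collected in the home interviews.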
Abstract:
The discovery of X-rays was undoubtedly one of the greatest stimuli for improving the efficiency of healthcare services. The ability to view, non-invasively, the inside of the human body has greatly facilitated the work of professionals in the diagnosis of diseases. An exclusive focus on image quality (IQ), without understanding how images are obtained, negatively affects efficiency in diagnostic radiology. The equilibrium between benefits and risks is often forgotten. It is necessary to adopt optimization strategies in radiological facilities to maximize the benefit (image quality) and minimize the risk (dose to the patient). In radiology, the implementation of optimization strategies requires an understanding of the image acquisition process. When a radiographer selects a certain value of a parameter (tube potential [kVp], tube current-exposure time product [mAs] or additional filtration), it is essential to know its meaning and the impact of its variation on dose and image quality. Without this, any optimization strategy will fail. Worldwide, data show that the use of X-rays has become increasingly frequent. In Cabo Verde, we note an effort by healthcare institutions (e.g. the Ministry of Health) to equip radiological facilities, and the recent installation of a telemedicine system requires the purchase of new radiological equipment. In addition, the transition from screen-film to digital systems is characterized by a rise in patient exposure. Given that this transition is slower in less developed countries, as is the case of Cabo Verde, the need to adopt optimization strategies becomes increasingly pressing. This study was conducted as an attempt to answer that need.
Although this work concerns the objective evaluation of image quality, while in medical practice the evaluation is usually subjective (visual evaluation of images by the radiographer/radiologist), studies have reported a correlation between these two types of evaluation (objective and subjective) [5-7], which supports conducting such studies. The purpose of this study is to evaluate the effect of the exposure parameters (kVp and mAs), when using additional copper (Cu) filtration, on dose and image quality in a Computed Radiography system.
Abstract:
The Tagus estuary is bordered by the largest metropolitan area in Portugal, the Lisbon capital city council. It has suffered the impact of several major tsunamis in the past, as shown by a recent revision of the catalogue of tsunamis that struck the Portuguese coast over the past two millennia. Hence, the exposure of populations and infrastructure established along the riverfront is a critical concern for the civil protection services. The main objectives of this work are to determine critical inundation areas in Lisbon and to quantify the associated severity through a simple index derived from the local maximum of momentum flux per unit mass and width. The employed methodology is based on the mathematical modelling of a tsunami propagating along the estuary, resembling the one that occurred on 1 November 1755 following the Mw 8.5 Great Lisbon Earthquake. The employed simulation tool was STAV-2D, a shallow-flow solver coupled with conservation equations for fine solid phases, now featuring the novelty of discrete Lagrangian tracking of large debris. Different sets of initial conditions were studied, combining distinct tidal, atmospheric and fluvial scenarios, so that the civil protection services were provided with comprehensive information to devise public warning and alert systems and post-event mitigation interventions. For the most severe scenario, the results showed a maximum inundation extent of 1.29 km in the Alcântara valley and water depths reaching nearly 10 m across Lisbon's riverfront.
Abstract:
Text based on the paper presented at the conference "Autonomous systems: inter-relations of technical and societal issues", held at Monte de Caparica (Portugal), Universidade Nova de Lisboa, on November 5th and 6th, 2009, and organized by IET-Research Centre on Enterprise and Work Innovation
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Economics from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Finance from the NOVA – School of Business and Economics and Maastricht University School of Business and Economics
Abstract:
The worldwide aging of the human population has promoted an increase in the incidence of neoplasia, including hematological cancers, which render patients particularly vulnerable to invasive fungal infections. For this reason, air filtration in hemato-oncology units has been recommended. However, little of the literature has assessed the impact of microbiological air quality on the occurrence of fungal infections in this population. We performed an integrative review of studies in the MEDLINE database published between January 1980 and October 2012, using the following combinations of keywords: air × quality × HEPA, air × quality × hematology, and airborne fungal infections. The search yielded only 13 articles, which suggest that high-efficiency filtering of the ambient air in hemato-oncology units can reduce the incidence of invasive fungal infections. However, no randomized clinical trial was found to confirm this suggestion. Currently, there is no consensus on the maximum allowable count of fungi in the air, which complicates filtration monitoring, including filter maintenance and replacement, and needs to be addressed in future studies.
Abstract:
In order to address and resolve the wastewater contamination problem of the Sines refinery, with the main objective of optimizing the quality of this stream and reducing the costs charged to the refinery, a dynamic mass balance was developed and implemented for ammonia and polar oil and grease (O&G) contamination in the wastewater circuit. The inadequate routing of sour gas from the sour water stripping unit and the kerosene caustic washing unit were identified, respectively, as the major sources of ammonia and polar substances present in the industrial wastewater effluent. For the O&G content, a predictive model was developed for the kerosene caustic washing unit, following the Projection to Latent Structures (PLS) approach. Comparison between analytical data for ammonia and polar O&G concentrations in refinery wastewater originating from the Dissolved Air Flotation (DAF) effluent and the predictions of the dynamic mass balance calculations shows very good agreement and highlights the dominant impact of the identified streams on the wastewater contamination levels. The ammonia contamination problem was solved by rerouting the sour gas through an existing line that had been clogged with ammonia salts due to a non-insulated line section, while for the O&G the dynamic mass balance was implemented as an online tool, which allows possible contamination events to be anticipated and the required preventive actions to be taken, and can also serve as a basis for establishing relationships between the O&G contamination in the refinery wastewater and the properties of the refined crude oils and the process operating conditions. The PLS model developed could be a great asset both in optimizing existing refinery wastewater treatment units or reuse schemes and in designing new ones.
In order to find a possible treatment solution for the spent caustic problem, on-site pilot plant experiments for NaOH recovery from the refinery kerosene caustic washing unit effluent were performed using an alkaline-resistant nanofiltration (NF) polymeric membrane, in order to evaluate its applicability for treating these highly alkaline and contaminated streams. At constant operating pressure and temperature and under adequate operating conditions, 99.9% rejection of oil and grease and 97.7% rejection of chemical oxygen demand (COD) were observed. No noticeable membrane fouling or flux decrease was registered up to a volume concentration factor of 3. These results allow the NF permeate to be reused in place of fresh caustic and the wastewater contamination to be significantly reduced, which can result in savings of 1.5 M€ per year at current prices for the largest Portuguese oil refinery. The capital investment needed to implement the required NF membrane system is less than 10% of that associated with the traditional wet air oxidation solution to the spent caustic problem. The operating costs are very similar, but can be less than half if the NF concentrate is reused in refinery pH control applications. The payback period was estimated at 1.1 years. Overall, the pilot plant experimental results and the process economic evaluation data indicate a very competitive solution through the proposed NF treatment process, which represents a highly promising alternative to conventional and existing spent caustic treatment units.
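The membrane-performance figures quoted above follow from two standard definitions: observed rejection, R = 1 − C_permeate/C_feed, and the volume concentration factor, VCF = V_feed/V_retentate. A short sketch with hypothetical concentrations and volumes (not the pilot plant's measurements) shows how the quoted 99.9% rejection and VCF of 3 are computed:

```python
# Hedged sketch of the membrane-performance metrics named in the abstract.
# All concentration and volume values below are hypothetical.
def rejection(c_feed, c_permeate):
    """Observed rejection: fraction of a contaminant held back by the membrane."""
    return 1.0 - c_permeate / c_feed

def volume_concentration_factor(v_feed, v_retentate):
    """VCF: initial feed volume divided by the remaining retentate volume."""
    return v_feed / v_retentate

r_og = rejection(1000.0, 1.0)                      # -> 0.999, i.e. 99.9 % O&G rejection
vcf = volume_concentration_factor(300.0, 100.0)    # -> 3.0, the reported fouling-free limit
```

Tracking rejection as the VCF rises is what reveals the onset of fouling or flux decline in such trials.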
Abstract:
This paper presents a critical and quantitative analysis of the influence of Power Quality on grid-connected solar photovoltaic microgeneration installations. First, the main regulations and legislation related to solar photovoltaic microgeneration in Portugal and Europe are introduced. Next, Power Quality monitoring results obtained from two residential solar photovoltaic installations located in the north of Portugal are presented, and it is explained how Power Quality events affect the operation of these installations. Afterwards, a methodology is described to estimate the energy production losses, and the impact on revenue, caused by abnormal operation of the electrical installation. This is done by comparing the amount of energy that was injected into the power grid with the theoretical amount of energy that could be injected under normal conditions. The analysis shows that Power Quality severely affects the operation of solar photovoltaic installations. The losses of revenue in the two monitored installations, M1 and M2, are estimated at about 27% and 22%, respectively.
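The loss-estimation methodology described above reduces to comparing injected energy against the theoretical production under normal conditions. A minimal sketch, with illustrative kWh figures rather than the monitored values of M1 and M2:

```python
# Sketch of the production-loss estimate: compare the energy actually
# injected into the grid with the theoretical production under normal
# conditions. The kWh figures below are illustrative placeholders.
def production_loss_fraction(e_theoretical_kwh, e_injected_kwh):
    """Fraction of potential production lost to abnormal operation."""
    return (e_theoretical_kwh - e_injected_kwh) / e_theoretical_kwh

loss = production_loss_fraction(1000.0, 730.0)   # -> 0.27, i.e. a 27 % loss
```

With a fixed feed-in tariff, the revenue impact is simply this fraction applied to the revenue that the theoretical production would have earned.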
Abstract:
Within the civil engineering field, the Finite Element Method has acquired significant importance, since numerical simulations are employed across a broad field that encompasses the design, analysis and prediction of the structural behaviour of constructions and infrastructures. Nevertheless, these mathematical simulations can only be useful if all the mechanical properties of the materials, boundary conditions and damages are properly modelled. Therefore, not only experimental data (static and/or dynamic tests) are required to provide reference parameters, but also robust calibration methods able to model damage or other special structural conditions. The present paper addresses the model calibration of a footbridge tested with static loads and ambient vibrations. Damage assessment was also carried out based on a hybrid numerical procedure, which combines discrete damage functions with sets of piecewise linear damage functions. Results from the model calibration show that the model reproduces the experimental behaviour of the bridge with good accuracy.
Abstract:
Biofilm research is growing more diverse and dependent on high-throughput technologies, and the large-scale production of results complicates data substantiation. In particular, it is often the case that experimental protocols are adapted to meet the needs of a particular laboratory and no statistical validation of the modified method is provided. This paper discusses the impact of intra-laboratory adaptation and non-rigorous documentation of experimental protocols on biofilm data interchange and validation. The case study is a non-standard, but widely used, workflow for Pseudomonas aeruginosa biofilm development, considering three analysis assays: the crystal violet (CV) assay for biomass quantification, the XTT assay for respiratory activity assessment, and the colony forming units (CFU) assay for determination of cell viability. The ruggedness of the protocol was assessed by introducing small changes in the biofilm growth conditions, which simulate minor protocol adaptations and non-rigorous protocol documentation. Results show that even minor variations in the biofilm growth conditions may affect the results considerably, and that the biofilm analysis assays lack repeatability. Intra-laboratory validation of non-standard protocols is found to be critical to ensure data quality and enable the comparison of results within and among laboratories.