943 results for Quality models
Abstract:
Distribution models are used increasingly for species conservation assessments over extensive areas, but the spatial resolution of the modeled data and, consequently, of the predictions generated directly from these models is usually too coarse for local conservation applications. Comprehensive distribution data at finer spatial resolution, however, require a level of sampling that is impractical for most species and regions. Models can be downscaled to predict distribution at finer resolutions, but this increases uncertainty because the predictive ability of models is not necessarily consistent beyond their original scale. We analyzed the performance of downscaled, previously published models of environmental favorability (a generalized linear modeling technique) for a restricted endemic insectivore, the Iberian desman (Galemys pyrenaicus), and a more widespread carnivore, the Eurasian otter (Lutra lutra), in the Iberian Peninsula. The models, built from presence–absence data at 10 × 10 km resolution, were extrapolated to a resolution 100 times finer (1 × 1 km). We compared downscaled predictions of environmental quality for the two species with published data on local observations and on important conservation sites proposed by experts. Predictions were significantly related to observed presence or absence of the species and to expert selection of sampling sites and important conservation sites. Our results suggest that downscaled projections of environmental quality can serve as a proxy when expensive and time-consuming field studies are not feasible. This method may be valid for other similar species if coarse-resolution distribution data are available to define high-quality areas at a scale that is practical for the application of concrete conservation measures.
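The environmental favorability technique mentioned above rescales a logistic-model probability so that it no longer depends on species prevalence (Real et al.'s favourability function). A minimal sketch, with an illustrative function name and toy numbers not taken from the study:

```python
def favourability(p, n1, n0):
    """Environmental favourability (Real et al. 2006): rescales a
    logistic-regression probability p so it is independent of the
    species' prevalence (n1 presences, n0 absences in training data).

        F = (p / (1 - p)) / (n1/n0 + p / (1 - p))
    """
    odds = p / (1.0 - p)
    return odds / (n1 / n0 + odds)

# With a balanced sample (n1 == n0), favourability equals probability:
print(favourability(0.5, 120, 120))  # → 0.5
```

Because the prevalence ratio is divided out, favourability values from the two species' models become comparable on the same 0–1 scale.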
Abstract:
The relationship between Total Quality Management (TQM) and Corporate Social Responsibility (CSR), through their concepts, approaches, and excellence models, is a reality in sustainable and stable companies. For organizations and people alike, acting correctly and doing right in society requires both quality management and social responsibility. This literature review builds on these two philosophies (Total Quality Management and Corporate Social Responsibility), essentially through a relational analysis of two papers, namely "TQM and CSR Nexus" by Ghobadian et al. (2007) and "The Corporate Social Responsibility Audit Within the Quality Management Framework" by Kok et al. (2001), applied to a concrete organizational case: the Nabeiro Delta Cafés Group - SGPS, SA.
Abstract:
The soil carries out a wide range of functions, and it is important to study the effects of land use on soil quality in order to identify more sustainable practices. Three field trials were considered to assess soil quality and functionality after human alteration, and to determine the power of soil enzymatic activities, biochemical indexes, and mathematical models in the evaluation of soil status. The first field was characterized by conventional and organic management, in which tillage effects were also tested. The second was characterized by conventional, organic, and agro-ecological management. Finally, the third was a beech forest where the effects of N deposition on soil organic carbon sequestration were tested. Results highlight that both enzyme activities and biochemical indexes can be valid parameters for soil quality evaluation. Conventional management and plowing negatively affected soil quality and functionality, with intensive tillage leading to a decline in microbial biomass and activity. Both organic and agro-ecological management proved to be good practices for maintaining soil functionality, with better microbial activity and metabolic efficiency; this also positively affected soil organic carbon content. At the eutrophic forest, enzyme activities and biochemical indexes responded positively to the treatments, but one year of experimentation was not enough to observe variation in soil organic carbon content. Mathematical models and biochemical indicators proved to be valid tools for assessing soil quality; nonetheless, it would be better to include the microbial component in the mathematical model and to consider more than one index if the aim is to evaluate overall soil quality and functionality.
In conclusion, the forest site is the richest in terms of organic carbon, microbial biomass, and activity, while organic and agro-ecological management seem to be the more sustainable options, although yield was not taken into consideration.
Abstract:
Inverse problems are at the core of many challenging applications. Variational and learning models provide estimated solutions of inverse problems as the outcome of specific reconstruction maps. In the variational approach, the result of the reconstruction map is the solution of a regularized minimization problem encoding information on the acquisition process and prior knowledge on the solution. In the learning approach, the reconstruction map is a parametric function whose parameters are identified by solving a minimization problem depending on a large set of data. In this thesis, we go beyond this apparent dichotomy between variational and learning models and show that they can be harmoniously merged in unified hybrid frameworks preserving their main advantages. We develop several highly efficient methods based on both these model-driven and data-driven strategies, for which we provide a detailed convergence analysis. The resulting algorithms are applied to solve inverse problems involving images and time series. For each task, we show that the proposed schemes improve on many other existing methods in terms of both computational burden and quality of the solution. In the first part, we focus on gradient-based regularized variational models, which are shown to be effective for segmentation purposes and for thermal and medical image enhancement. We consider gradient sparsity-promoting regularized models for which we develop different strategies to estimate the regularization strength. Furthermore, we introduce a novel gradient-based Plug-and-Play convergent scheme using a deep learning based denoiser trained on the gradient domain. In the second part, we address the tasks of natural image deblurring, image and video super-resolution microscopy, and positioning time series prediction through deep learning based methods.
We boost the performance of supervised deep learning strategies, such as trained convolutional and recurrent networks, and of unsupervised ones, such as Deep Image Prior, by penalizing the losses with handcrafted regularization terms.
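The variational side of this dichotomy can be illustrated with the simplest regularized reconstruction map, a Tikhonov-penalized least-squares problem solved by gradient descent; the operator, data, and step size below are toy assumptions, not the thesis's models:

```python
import numpy as np

def solve_tikhonov(A, b, lam, steps=20000, lr=0.005):
    """Gradient descent on the regularized objective
        f(x) = ||A x - b||^2 + lam * ||x||^2,
    a minimal instance of a model-driven reconstruction map: the data
    term encodes the acquisition process, the penalty the prior."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = 2 * A.T @ (A @ x - b) + 2 * lam * x
        x -= lr * grad
    return x

# Toy inverse problem: recover x_true from noiseless measurements b = A x_true
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x_hat = solve_tikhonov(A, b, lam=1e-3)
```

For this convex objective, the iterate converges to the closed-form solution of the normal equations (A^T A + lam I) x = A^T b; learned or Plug-and-Play schemes replace the quadratic penalty with a data-driven prior while keeping the same iterative structure.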
Abstract:
Air pollution is one of the greatest health risks in the world. At the same time, its strong correlation with climate change, as well as with the Urban Heat Island effect and heat waves, intensifies the effects of all these phenomena. Good air quality and high levels of thermal comfort are the main goals to be reached in urban areas in the coming years. Air quality forecasts help decision-makers improve air quality and public health strategies, mitigating the occurrence of acute air pollution episodes. Air quality forecasting approaches combine an ensemble of models to provide forecasts from global to regional air pollution, with downscaling for selected countries and regions. The development of models dedicated to urban air quality issues requires a good set of data on urban morphology and building material characteristics. Only a few examples of air quality forecast systems at the urban scale exist in the literature, and they are often limited to selected cities. This thesis sets up a methodology for the development of a forecasting tool that can be adapted to any city and uses a new parametrization for vegetated areas. The parametrization method, based on aerodynamic parameters, produces the spatially varying urban roughness. At the core of the forecasting tool is a dispersion model (urban scale) used in forecasting mode, together with the meteorological and background concentration forecasts provided by two regional numerical weather forecasting models. The tool produces the 1-day spatial forecast of NO2, PM10, and O3 concentrations, air temperature, air humidity, and BLQ-Air index values. The tool is automatized to run every day, and the maps produced are displayed on the e-Globus platform, updated daily. The results obtained indicate that the forecast outputs were in good agreement with the observed measurements.
Abstract:
Bone disorders have a severe impact on body functions and quality of life, and no satisfactory therapies exist yet. The current models for studying bone disease are scarcely predictive, and the existing therapeutic options fail for complex cases. To mimic and/or restore bone, 3D printing/bioprinting allows the creation of 3D structures with different material compositions, properties, and designs. In this study, 3D printing/bioprinting has been explored for (i) 3D in vitro tumor models and (ii) regenerative medicine. Tumor models were developed by investigating different bioinks (i.e., alginate, modified gelatin) enriched with hydroxyapatite nanoparticles to increase printing fidelity and the level of biomimicry, thus mimicking the organic and inorganic phases of bone. High Saos-2 cell viability was obtained, and the formation of spheroid clusters, as occurs in vivo, was observed. To develop new synthetic bone grafts, two approaches were explored. In the first, novel magnesium-phosphate scaffolds were investigated by extrusion-based 3D printing for spinal fusion. The 3D printing process and parameters were optimized to obtain custom-shaped structures with competent mechanical properties. The 3D printed structures were combined with alginate porous structures created by a novel ice-templating technique, to be loaded with an antibiotic drug for infection prevention. Promising results in terms of planktonic growth inhibition were obtained. In the second strategy, marine waste precursors were considered for conversion into biogenic hydroxyapatite (HA) using a mild wet conversion method with different parameters. The HA/carbonate conversion efficacy was analysed for each precursor (by FTIR and SEM), and the best conditions were combined with alginate to develop a composite structure. The composite paste was successfully employed in a custom-modified 3D printer to obtain stable 3D printed scaffolds.
In conclusion, the osteomimetic materials developed in this study for bone models and synthetic grafts are promising for the bone field.
Abstract:
The design process of any electric vehicle system has to be oriented towards the best energy efficiency, together with the constraint of maintaining comfort in the vehicle cabin. The main aim of this study is to find the best thermal management solution in terms of HVAC efficiency without compromising occupants' comfort and internal air quality. An Arduino-controlled low-cost sensor system was developed, compared against reference instrumentation (average R-squared of 0.92), and then used to characterise the vehicle cabin in real parking and driving trials. Data on the energy use of the HVAC were retrieved from the car's On-Board Diagnostic port. Energy savings using recirculation can reach 30%, but pollutant concentrations in the cabin build up in this operating mode. Moreover, the temperature profile appeared strongly non-uniform, with air temperature differences of up to 10 °C. Optimisation methods often require a high number of runs to find the optimal configuration of the system. Fast models proved to be beneficial for this task, while CFD-1D models are usually slower despite the higher level of detail they provide. In this work, the collected dataset was used to train a fast ML model of both cabin and HVAC using linear regression. The average scaled RMSE over all trials is 0.4%, while the computation time is 0.0077 ms for each second of simulated time on a laptop computer. Finally, a reinforcement learning environment was built in OpenAI Gym and Stable-Baselines3, using the built-in Proximal Policy Optimisation algorithm to update the policy and seek the best compromise between comfort, air quality, and energy reward terms. The learning curves show an oscillating behaviour overall, with only 2 experiments behaving as expected, albeit too slowly. This result leaves large room for improvement, ranging from reward function engineering to the expansion of the ML model.
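The fast linear-regression cabin model and the scaled-RMSE metric quoted above can be sketched as follows; the feature names, toy coefficients, and data are illustrative assumptions, not the thesis dataset:

```python
import numpy as np

# Hypothetical training data: each row is [ambient_T, hvac_power, recirc_flag];
# the target is cabin air temperature. Names and values are illustrative only.
rng = np.random.default_rng(1)
X = rng.uniform([0.0, 0.0, 0.0], [35.0, 5.0, 1.0], size=(200, 3))
w_true = np.array([0.6, -1.2, 0.8])
y = X @ w_true + 21.0  # noiseless toy relation with a 21 °C offset

# Ordinary least squares with an intercept column, via numpy's lstsq:
# a "fast model" like this evaluates in microseconds, which is what makes
# it usable inside an optimisation or reinforcement-learning loop.
Xb = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Scaled RMSE, the error metric quoted in the abstract:
pred = Xb @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
scaled_rmse = rmse / (y.max() - y.min())
```

On this noiseless toy data the fit recovers the generating coefficients exactly, so the scaled RMSE is at machine-precision level; on real trial data it would settle at the irreducible measurement noise.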
Abstract:
Artificial Intelligence (AI) and Machine Learning (ML) are novel data analysis techniques providing very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, and they are also being used to develop intelligent systems. Their success rests on complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the Explainable AI (XAI) field has become prominent in recent years. XAI consists of models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular a pair of complementary Stability Indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between explanation stability and reliability. We subsequently put forward GLEAMS, a model-agnostic surrogate interpretable model that needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios, from both the dataset and the model perspective. Finally, we argue that synthetic data are an emerging trend in AI, being used more and more to train complex models instead of the original data. To be able to explain the outcomes of such models, we must guarantee that synthetic data are reliable enough for their explanations to translate to real-world individuals. To this end, we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
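The instability that the Stability Indices target comes from LIME's random sampling: repeated explanations of the same instance can disagree. A simplified, hypothetical illustration of the idea (the thesis's actual indices are more refined than this top-feature Jaccard agreement):

```python
from itertools import combinations

def jaccard(a, b):
    """Jaccard similarity of two feature sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def stability_index(feature_sets):
    """Mean pairwise Jaccard similarity between the top-feature sets
    returned by repeated LIME runs on the same instance: 1.0 means
    perfectly repeatable explanations, values near 0 mean instability."""
    pairs = list(combinations(feature_sets, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Three hypothetical LIME runs on one credit-scoring instance:
runs = [["income", "age"], ["income", "age"], ["income", "debt"]]
print(stability_index(runs))  # → 0.5555... = (1 + 1/3 + 1/3) / 3
```

A practitioner would flag explanations whose index falls below some threshold as unreliable, which is the kind of diagnostic OptiLIME then trades off against explanation fidelity.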
Abstract:
Natural Language Processing (NLP) has always been one of the most popular topics in Artificial Intelligence. Argument-related research in NLP, such as argument detection, argument mining, and argument generation, has been popular, especially in recent years. In our daily lives, we use arguments to express ourselves, and the quality of those arguments heavily impacts the effectiveness of our communication with others. In professional fields, such as legislation and academia, good-quality arguments play an even more critical role. Therefore, generating good-quality arguments is a challenging research task of great importance in NLP. The aim of this work is to investigate the automatic generation of good-quality arguments conditioned on a given topic, stance, and aspect (control codes). To achieve this goal, a module based on BERT [17] is constructed to judge an argument's quality; this module is used to assess the quality of the generated arguments. Another module, based on GPT-2 [19], is implemented to generate arguments, with stances and aspects used as guidance during generation. After combining all these models and techniques, the generated arguments can be ranked to evaluate the final performance. This dissertation describes the architecture and experimental setup, analyzes the results of our experiments, and discusses future directions.
Abstract:
The study of the tides of celestial bodies can unveil important information about their interiors as well as their orbital evolution. The most important tidal parameter is the Love number, which defines the deformation of the gravity field due to an external perturbing body. Tidal dissipation is very important because it drives the secular orbital evolution of natural satellites; this is especially true in the Jupiter system, where three of the Galilean moons, Io, Europa, and Ganymede, are locked in an orbital resonance in which the ratio of their mean motions is 4:2:1, called the Laplace resonance. Tidal dissipation is described by the dissipation ratio k2/Q, where Q is the quality factor, which describes the damping of the system. The goal of this thesis is to analyze and compare the two main tidal dynamical models, Mignard's model and the gravity field variation model, to understand the differences between them, with a main focus on the single-moon case with Io; this focus also helps clarify the differences between the two models without overcomplicating the dynamics. In this work we have verified and validated both models, compared them, and pinpointed the main differences and features that characterize each. Mignard's model treats the tides directly as a force, while the gravity field variation model describes the tides through a change in the spherical harmonic coefficients. Finally, we have also briefly analyzed the difference between the single-moon and two-moon cases, and we have confirmed that the governing equations describing the change of semi-major axis and eccentricity no longer hold when more moons are present.
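For orientation, in the standard constant-Q formulation the secular drift of a moon's semi-major axis driven by tides raised on the planet takes the textbook form below (a sketch, not the thesis's exact equations; here k2 and Q refer to the planet, m is the moon's mass, M and R the planet's mass and radius, and n the mean motion):

```latex
\frac{\mathrm{d}a}{\mathrm{d}t} \;=\; 3\,\frac{k_2}{Q}\,\frac{m}{M}\left(\frac{R}{a}\right)^{5} n\,a
```

Mignard's model recovers this secular drift from an explicit tidal force with a constant time lag, while the gravity field variation model obtains it from periodic changes in the spherical harmonic coefficients; sign and prefactor conventions differ between formulations.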
Abstract:
A miniaturised gas analyser is described and evaluated, based on a substrate-integrated hollow waveguide (iHWG) coupled to a micro-sized near-infrared spectrophotometer comprising a linear variable filter and an array of InGaAs detectors. This gas sensing system was applied to analyse surrogate samples of natural fuel gas containing methane, ethane, propane, and butane, quantified using multivariate regression models based on partial least squares (PLS) algorithms and Savitzky–Golay first-derivative data preprocessing. External validation of the obtained models revealed root mean square errors of prediction of 0.37, 0.36, 0.67, and 0.37% (v/v) for methane, ethane, propane, and butane, respectively. The developed sensing system provides particularly rapid response times upon composition changes of the gaseous sample (approximately 2 s) due to the minute volume of the iHWG-based measurement cell. The sensing system is fully portable with a hand-held-sized analyser footprint, and is thus ideally suited for field analysis. Last but not least, the obtained results corroborate the potential of NIR-iHWG analysers for monitoring the quality of natural gas and petrochemical gaseous products.
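The Savitzky–Golay first-derivative preprocessing step can be sketched with SciPy; the toy spectrum, window length, and polynomial order are illustrative choices, and in the study the derivative spectra would then feed a PLS regression (e.g. scikit-learn's `PLSRegression`), which is omitted here:

```python
import numpy as np
from scipy.signal import savgol_filter

# Toy "spectrum": a linear baseline plus a single absorption-like band.
# Values are illustrative, not NIR data from the study.
wavelength = np.linspace(0.0, 1.0, 101)
spectrum = 2.0 * wavelength + np.exp(-((wavelength - 0.5) ** 2) / 0.005)

# First-derivative Savitzky-Golay filtering: fits a local polynomial
# (here order 2 over an 11-point window) and returns its derivative,
# removing additive baseline offsets before multivariate calibration.
deriv = savgol_filter(spectrum, window_length=11, polyorder=2, deriv=1,
                      delta=wavelength[1] - wavelength[0])
```

Differentiation removes any constant baseline exactly (a pure offset has zero derivative), which is why this preprocessing is common before PLS calibration of gas spectra.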
Abstract:
Prosopis rubriflora and Prosopis ruscifolia are important species in the Chaquenian regions of Brazil. Because of the restricted extent and frequency of their physiognomies, they are excellent models for conservation genetics studies. The use of microsatellite markers (Simple Sequence Repeats, SSRs) has become increasingly important in recent years and has proven to be a powerful tool for both ecological and molecular studies. In this study, we present the development and characterization of 10 new markers for P. rubriflora and 13 new markers for P. ruscifolia. Genotyping was performed on 40 P. rubriflora samples and 48 P. ruscifolia samples from Chaquenian remnants in Brazil. The polymorphism information content (PIC) of the P. rubriflora markers ranged from 0.073 to 0.791, and no null alleles or deviations from Hardy-Weinberg equilibrium (HW) were detected. The PIC values for the P. ruscifolia markers ranged from 0.289 to 0.883, but departures from HW and null alleles were detected at certain loci; these departures may have resulted from anthropic activities, such as the presence of livestock, which is very common in the remnant areas. In this study, we describe novel polymorphic SSR markers that may be helpful in future genetic studies of P. rubriflora and P. ruscifolia.
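The PIC values reported per marker follow the standard Botstein et al. (1980) formula computed from allele frequencies at each locus; a minimal sketch with illustrative toy frequencies:

```python
from itertools import combinations

def pic(freqs):
    """Polymorphism information content (Botstein et al. 1980):
        PIC = 1 - sum_i p_i^2 - sum_{i<j} 2 * p_i^2 * p_j^2
    where p_i are the allele frequencies at one locus."""
    hom = sum(p * p for p in freqs)
    cross = sum(2 * (p * p) * (q * q) for p, q in combinations(freqs, 2))
    return 1.0 - hom - cross

# A biallelic locus with equal frequencies gives the textbook value 0.375
print(pic([0.5, 0.5]))  # → 0.375
```

Higher PIC (more, and more evenly distributed, alleles) means a marker is more informative for distinguishing individuals, which is why the 0.073–0.883 range reported above spans weakly to highly informative loci.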
Abstract:
This study sought to evaluate the association between the impact of oral disorders, in terms of physical/psychosocial dimensions, and quality of life among the elderly. It involved a cross-sectional study conducted among the elderly (65-74 years) in 2008/2009. Social impact was assessed using the Oral Health Impact Profile (OHIP-14) and quality of life using the SF-12 Short-Form Health Survey. Descriptive, univariate, and multivariate (logistic regression) analyses were conducted with correction for the design effect, using SPSS® 18.0 software. Of the 800 individuals approached, 736 elderly individuals participated (response rate = 92%), with a mean age of 67.77 years; the majority showed no impact based on the OHIP prevalence measure. The functional limitation dimension of the OHIP was associated with the physical domain of the SF-12, irrespective of the other variables investigated, whereas the severity of the OHIP and its psychological discomfort and disability dimensions were associated with the mental domain of the SF-12. The conclusion reached is that some impacts of oral disorders were associated with unsatisfactory quality of life in the physical and mental domains.
Abstract:
To assess the quality of care of women with severe maternal morbidity and to identify associated factors. This is a national multicenter cross-sectional study performing surveillance for severe maternal morbidity, using the World Health Organization criteria. The expected number of maternal deaths was calculated with the maternal severity index (MSI), based on the severity of complications, and the standardized mortality ratio (SMR) for each center was estimated. Analyses of the adequacy of care were performed. 17 hospitals were classified as providing adequate care and 10 as providing non-adequate care. Besides an almost twofold increase in the maternal mortality ratio, the main factors associated with non-adequate performance were geographic difficulty in accessing health services (P < 0.001), delays related to quality of medical care (P = 0.012), absence of blood derivatives (P = 0.013), difficulties of communication between health services (P = 0.004), and any delay during the whole process (P = 0.039). This is an example of how the performance of health services can be evaluated using a benchmarking tool specific to obstetrics. In this study, the MSI was a useful tool for identifying differences in maternal mortality ratios and factors associated with non-adequate performance of care.
Abstract:
The aim of this study was to assess the quality of diet among the elderly and its associations with socio-demographic variables, health-related behaviors, and diseases. A population-based cross-sectional study was conducted in a representative sample of 1,509 elderly participants in a health survey in Campinas, São Paulo State, Brazil. Food quality was assessed using the Revised Diet Quality Index (DQI-R). Mean index scores were estimated, and a multiple regression model was employed for the adjusted analyses. The highest diet quality scores were associated with age 80 years or older, Evangelical religion, diabetes mellitus, and physical activity, while the lowest scores were associated with home environments shared with three or more people, smoking, and consumption of soft drinks and alcoholic beverages. The findings emphasize a general need for diet quality improvements in the elderly, specifically in subgroups with unhealthy behaviors, who should be targeted with comprehensive strategies.