993 results for Code Quality
Abstract:
[ES] This thesis studied the impact of different fertilizers and pesticides used in the Vulnerable Zone of Vitoria-Gasteiz on the quality of the soil and waters of that zone. It was confirmed that significant amounts of nitrates and pesticides (e.g., ethofumesate and difenoconazole) still leach into the waters of the Vulnerable Zone today during the cultivation of sugar beet (Beta vulgaris L.), a crop highly characteristic of the study area. The high nitrate content of the groundwater in the Vulnerable Zone was found to be mitigated, at least in part, by the denitrifying microbial activity hosted in the riparian zone of the Salburua wetland. This process, however, entails the emission of substantial amounts of greenhouse gases (CO2 and N2O) into the atmosphere, and can be negatively affected by the presence of pesticides (e.g., deltamethrin) in the environment. Furthermore, we observed that several pesticides (deltamethrin, ethofumesate, difenoconazole) applied at concentrations similar to field application rates induce limited, transient changes in soil microbial communities, most significant in the case of the fungicide difenoconazole. The effect of the pesticides became more pronounced as their concentration in the medium increased. Finally, we found that applying organic fertilizers (avicompost), instead of traditional synthetic fertilizers (NPK), not only improves the degradation of pesticides and reduces their impact on soil quality, but could also help reduce nitrate losses through leaching.
Abstract:
An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
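The population-coding account above can be illustrated with a toy simulation. This is a minimal sketch under assumed Gaussian orientation tuning, not the paper's actual model: pooling the population responses to a target and a nearby flanker and decoding with a population vector yields an orientation estimate near their average, the "compulsory averaging" signature of crowding.

```python
import numpy as np

def population_response(theta_deg, pref_deg, sigma=20.0):
    """Gaussian orientation tuning, circular over 180 degrees."""
    d = (theta_deg - pref_deg + 90.0) % 180.0 - 90.0
    return np.exp(-d**2 / (2 * sigma**2))

def decode(pop, pref_deg):
    """Population-vector decoding on the doubled-angle circle."""
    ang = np.deg2rad(2 * pref_deg)
    vec = np.sum(pop * np.exp(1j * ang))
    return (np.rad2deg(np.angle(vec)) / 2) % 180.0

prefs = np.arange(0.0, 180.0, 1.0)      # preferred orientations of the population
target, flanker = 80.0, 100.0           # crowded target plus one flanker
# spatial integration: responses to both stimuli are pooled before decoding
pooled = population_response(target, prefs) + population_response(flanker, prefs)
est = decode(pooled, prefs)
print(est)   # close to 90, the average of target and flanker orientations
```

The decoded orientation sits between the two stimulus orientations, so the individual target orientation is no longer recoverable after pooling, which is the behavioural signature the model reproduces.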
Abstract:
User-supplied knowledge and interaction are vital components of a toolkit for producing high quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the components that such a parallelisation toolkit should possess to provide an effective environment to identify, extract and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused on only that which is needed. User control over the level and extent of information revealed at any phase is supplied through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.
Abstract:
This chapter discusses the code parallelization environment, in which a number of tools addressing the main tasks, such as code parallelization, debugging, and optimization, are available. The parallelization tools include ParaWise and CAPO, which enable the near-automatic parallelization of real-world scientific application codes for shared and distributed memory-based parallel systems. The chapter discusses the use of ParaWise and CAPO to transform the original serial code into an equivalent parallel code that contains appropriate OpenMP directives. Additionally, as user involvement can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform near-automatic relative debugging of an OpenMP program that has been parallelized either using the tools or manually. For these tools to be effective in parallelizing a range of applications, a high quality, fully interprocedural dependence analysis, as well as user interaction, is vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for parallelized NASA codes are discussed and show the benefits of using the environment.
Abstract:
Code parallelization using OpenMP for shared memory systems is relatively easier than using message passing for distributed memory systems. Despite this, it is still a challenge to use OpenMP to parallelize application codes in a way that yields effective scalable performance when executed on a shared memory parallel system. We describe an environment that assists the programmer in the various tasks of code parallelization, reducing both the time required and the level of skill needed. The parallelization environment includes a number of tools that address the main tasks of parallelism detection, OpenMP source code generation, debugging and optimization. These tools include a high quality, fully interprocedural dependence analysis with user interaction capabilities to facilitate the generation of efficient parallel code, an automatic relative debugging tool to identify erroneous user decisions in that interaction, and performance profiling to identify bottlenecks. Finally, experiences of parallelizing some NASA application codes are presented to illustrate some of the benefits of using the evolving environment.
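The parallelism detection such tools perform rests on dependence analysis: a loop can only be parallelized if no iteration writes a location that a different iteration reads. A minimal sketch of that idea follows; it is an illustrative exhaustive test over simple affine subscripts, not the actual interprocedural analysis the tools implement.

```python
def carries_dependence(write_stride, write_off, read_stride, read_off, n):
    """Crude dependence test for array subscripts over iterations 0..n-1.
    The loop writes a[write_stride*i + write_off] and reads
    a[read_stride*i + read_off]: does any write index coincide with a
    read index from a DIFFERENT iteration?  (Illustration only; real
    analyzers use symbolic GCD/Banerjee-style tests, not enumeration.)"""
    for i in range(n):
        for j in range(n):
            if i != j and write_stride * i + write_off == read_stride * j + read_off:
                return True
    return False

# a[i] = a[i] + 1    -> no loop-carried dependence: safe to parallelize
print(carries_dependence(1, 0, 1, 0, 100))   # False
# a[i] = a[i-1] + 1  -> carried dependence: must run serially
print(carries_dependence(1, 0, 1, -1, 100))  # True
```

When the test proves independence, each iteration may be handed to a different thread; a carried dependence forces serial execution or a code transformation, which is where user knowledge of the application becomes valuable.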
Abstract:
There is a significant lack of indoor air quality research in low energy homes. This study compared the indoor air quality of eight newly built case study homes constructed to similar levels of air-tightness and insulation, with two different ventilation strategies (four homes with Mechanical Ventilation with Heat Recovery (MVHR) systems/Code level 4 and four homes naturally ventilated/Code level 3). Indoor air quality measurements were conducted over a 24 h period in the living room and main bedroom of each home during the summer and winter seasons. Simultaneous outside measurements and an occupant diary were also employed during the measurement period. Occupant interviews were conducted to gain information on perceived indoor air quality, occupant behaviour and building related illnesses. Knowledge of the MVHR system, including ventilation related behaviour, was also studied. Results suggest indoor air quality problems in both the mechanically ventilated and naturally ventilated homes, with significant issues identified regarding occupant use in the social homes.
Abstract:
CONSUMERS' SENSORY EVALUATION OF MELON SWEETNESS AND QUALITY Agulheiro Santos, A.C., Rato, A.E., Laranjo, M. and Gonçalves, C. Departamento de Fitotecnia, Escola de Ciências e Tecnologia, Instituto de Ciências Agrárias e Ambientais Mediterrânicas (ICAAM), Instituto de Investigação e Formação Avançada (IIFA), Universidade de Évora, Polo da Mitra, Ap.94, 7002-554 Évora, Portugal. ABSTRACT The sensory quality of fruits comprises a range of attributes such as sweetness, acidity, aroma, firmness and color. Taste perception and the perception threshold of these attributes vary according to the psychological and cultural development of individuals. To better understand how consumers evaluate melon quality, consumers were invited to taste melon samples in supermarkets in Évora (South region), Lisbon (Central region) and Vila Nova de Gaia (North region). The present work explored the importance given by consumers to sweetness in classifying the overall quality of melon. Furthermore, the relationship between the chemical evaluation of Total Soluble Solids (TSS) and the sweetness of melon was studied. Fruits of the variety Melão branco, picked randomly from those exposed for sale in supermarkets, were used for analysis. Fruits were cut along the equatorial zone and only the central part of the fruit, opposite the part that leaned on the soil, was used, to obtain homogeneous samples. Consumers were invited to taste four small pieces of each fruit, previously referenced with a code number, and to answer a questionnaire with two questions related to sweetness and overall quality. Each question had five possible levels, identified from "Nothing sweet" to "Extremely sweet" in one case, and from "Poor" to "Excellent" in the other. Simultaneously, the TSS value (measured in °Brix) of each melon used in the study was evaluated by refractometry.
This sensory analysis allowed us to point out the following findings: first of all, there is good agreement between the results obtained for classifying "Sweetness" and "Overall Quality" (Cohen's Kappa = 53.1%, p < 0.001), which means, for example, that fruits of excellent quality are in general extremely sweet. Moreover, fruits with less than 9.6 °Brix are considered of poor quality and not sweet at all, whereas fruits with values between 10 °Brix and 12 °Brix are considered good in terms of overall quality. It seems that the threshold for the stimulus/intensity of sweetness lay between 10 °Brix and 14 °Brix for this melon variety. Acknowledgments This work was supported by national funds through Fundação para a Ciência e a Tecnologia (FCT) under the Strategic Project Pest-OE/AGR/UI0115/2014 and co-funded by FEDER funds through the COMPETE Program.
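The agreement statistic reported above, Cohen's kappa, corrects the observed agreement between two paired ratings for the agreement expected by chance. A minimal sketch of how it is computed, using hypothetical paired scores rather than the study's data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: (observed - expected) / (1 - expected), where
    'expected' is the chance agreement implied by each rater's marginal
    category frequencies."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical paired scores (sweetness vs. overall quality) on a 5-point scale
sweet   = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
quality = [1, 2, 3, 3, 3, 4, 5, 5, 5, 4]
print(round(cohens_kappa(sweet, quality), 3))  # 0.615
```

A kappa of 0 would mean agreement no better than chance and 1 perfect agreement, so the reported 53.1% indicates moderate agreement between the two questions.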
Abstract:
In Québec, the Act respecting health services and social services, Chapter S-4.2, in its section 233, requires every health care institution to have a code of ethics that essentially specifies users' rights and sets out the conduct expected of staff. The legislator sought to improve staff conduct as early as the beginning of the 1990s and considered designating an oversight body to ensure this. That constraint was not retained, and 20 years later the intention to ensure expected conduct is still not subject to constraints or controls, even though it is still desired. In 2003, however, the Minister put in place a process of ministerial visits to residential care facilities, and to date some 150 institutions have been visited. These teams have looked, among other things, at the function of the code of ethics in supporting the management of these institutions. They have not been able to rely on the code of ethics as the foundation for the clinical, organizational and management decisions of each organization in Québec's health and social services network. At this point it must be acknowledged that the mandatory code of ethics ranks among the many constraints faced by organizations. Institutions must undergo an accreditation process every three years, and the code of ethics is not a dynamic element retained in that quality-standards validation process either. Moreover, a Québec journal specializing in health management devoted an entire issue of 15 articles to "ethics and behaviour", and the code of ethics is absent from it except for two articles that address it specifically.
Is it really ethics that is at issue in this code, or is it not rather professional conduct (deontology), especially since the legislator wants above all to ensure appropriate behaviour on the part of employees and other persons practising their profession? Would a code of conduct not be more appropriate for achieving the intended ends? This question is answered in this thesis, which examines the concepts of ethics, deontology, codes, and the regulation of behaviour. In addition, detailed analyses of 35 current codes of ethics from various institutions and various regions of Québec are presented. The literature gives us the conditions for a code's success; besides the importance to be given to the values stated in the organization, it also addresses the sanctions to be provided for non-compliance with those values. These must be clear and enforced. Finally, many organizations now speak of a code of conduct, and this term would be entirely appropriate to fulfil the wish of the legislator, who wants to ensure irreproachable conduct from employees and the other persons who work there. That is the conclusion of this work, stated in the form of a recommendation.
Abstract:
Shrimp aquaculture has provided tremendous opportunity for the economic and social upliftment of rural communities in the coastal areas of our country. Over a hundred thousand farmers, of whom about 90% belong to the small and marginal category, are engaged in shrimp farming. Penaeus monodon is the most predominant cultured species in India and is mainly exported to highly sophisticated, quality- and safety-conscious world markets. Food safety has been of concern to humankind since the dawn of history, and this concern resulted in the evolution of a cost-effective food safety assurance method, the Hazard Analysis Critical Control Point (HACCP). Considering the major contribution of cultured Penaeus monodon to total shrimp production, the economic losses encountered due to disease outbreaks, and the fact that traditional methods of quality control and end-point inspection cannot guarantee the safety of our cultured seafood products, it is essential that science-based preventive approaches like HACCP and Prerequisite Programmes (PRP) be implemented in our shrimp farming operations. PRP is considered a support system that provides a solid foundation for HACCP. The safety of postlarvae (PL) supplied for brackish water shrimp farming has also become an issue of concern over the past few years. The quality and safety of hatchery-produced seeds have been deteriorating, and disease outbreaks have become very common in hatcheries. It is in this context that the necessity of following strict quarantine measures, with standards and codes of practice, becomes significant. Though there has been much hue and cry about the need to extend the focus of seafood safety assurance from processing and exporting to the pre-harvest and hatchery rearing phases, experimental moves in this direction have been rare or nil. Only an integrated management system can assure the effective control of the quality, hygiene and safety related issues.
This study therefore aims at designing a safety and quality management system model for implementation in shrimp farming and hatchery operations by linking the concepts of HACCP and PRP.
Abstract:
Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design as well as its hardware/software development can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of software bugs that are otherwise hard to detect more effective, using the static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces the state space created, contributing to an improved state-of-the-art in model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
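The redundancy criterion described above can be illustrated on a straight-line toy instruction sequence: a bank-select instruction is redundant when it re-selects the bank that is already active at that point. This is a minimal sketch of the state-tracking idea with hypothetical mnemonics, not actual PIC16F87X opcodes, and it ignores branching, which the dissertation's relation-matrix approach must handle.

```python
def eliminate_redundant_bank_switches(instructions):
    """Drop bank-select instructions that re-select the already-active bank.
    Instructions are (op, arg) tuples; ('BSEL', n) switches to bank n.
    Tracks the active bank along a straight-line path (no branches)."""
    active = None  # bank state unknown at entry
    kept = []
    for op, arg in instructions:
        if op == 'BSEL':
            if arg == active:
                continue          # redundant: this bank is already selected
            active = arg          # state transition to the new bank
        kept.append((op, arg))
    return kept

prog = [('BSEL', 1), ('LD', 'x'), ('BSEL', 1), ('ST', 'y'),
        ('BSEL', 0), ('BSEL', 0), ('LD', 'z')]
print(eliminate_redundant_bank_switches(prog))
# the second ('BSEL', 1) and the second ('BSEL', 0) are dropped
```

On real banked-memory parts the saving matters because every bank switch costs an instruction cycle and code space in tight inner loops; choosing the data-to-bank allocation well, as the proposed algorithm does, minimizes how many switches remain.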
Abstract:
Objective: To investigate the sociodemographic determinants of diet quality of the elderly in four EU countries. Design: Cross-sectional study. For each country, a regression was performed of a multidimensional index of dietary quality v. sociodemographic variables. Setting: In Finland, Finnish Household Budget Survey (1998 and 2006); in Sweden, SNAC-K (2001–2004); in the UK, Expenditure & Food Survey (2006–07); in Italy, Multi-purpose Survey of Daily Life (2009). Subjects: One- and two-person households of over-50s (Finland, n 2994; UK, n 4749); over-50s living alone or in two-person households (Italy, n 7564); over-60s (Sweden, n 2023). Results: Diet quality among the EU elderly is both low on average and heterogeneous across individuals. The regression models explained a small but significant part of the observed heterogeneity in diet quality. Resource availability was associated with diet quality either negatively (Finland and UK) or in a non-linear or non-statistically significant manner (Italy and Sweden), as was the preference for food parameter. Education, not living alone and female gender were characteristics positively associated with diet quality with consistency across the four countries, unlike socio-professional status, age and seasonality. Regional differences within countries persisted even after controlling for the other sociodemographic variables. Conclusions: Poor dietary choices among the EU elderly were not caused by insufficient resources and informational measures could be successful in promoting healthy eating for healthy ageing. On the other hand, food habits appeared largely set in the latter part of life, with age and retirement having little influence on the healthiness of dietary choices.
Abstract:
A manufactured aeration and nanofiltration MBR greywater system was tested during continuous operation at the University of Reading, to demonstrate reliability in the delivery of high quality treated greywater. Its treatment performance was evaluated against British Standard criteria [BSI (Greywater Systems—Part 1 Code of Practice: BS 8525-1:2010. BS Press, 2010); (Greywater Systems—Part 2 Domestic Greywater Treatment, Requirements and Methods: BS 8525-2:2011. BS Press, 2011)]. The low carbon greywater recycling technology produced excellent analytical results as well as consistency in performance. User acceptance of such reliably treated greywater was then evaluated through user perception studies. The results inform the potential supply of treated greywater to student accommodation. Out of 135 questionnaire replies, 95% demonstrated, in one or more attributes, a lack of aversion to using treated, recycled greywater.
Abstract:
Includes bibliography
Abstract:
University Master's in Intelligent Systems and Numerical Applications in Engineering (SIANI)
Abstract:
OBJECTIVES To compare longitudinal patterns of health care utilization and quality of care for other health conditions between breast cancer-surviving older women and a matched cohort without breast cancer. DESIGN Prospective five-year longitudinal comparison of cases and matched controls. SUBJECTS Newly identified breast cancer patients recruited during 1997–1999 from four geographic regions (Los Angeles, CA; Minnesota; North Carolina; and Rhode Island; N = 422) were matched by age, race, baseline comorbidity and zip code location with up to four non-breast-cancer controls (N = 1,656). OUTCOMES Survival; numbers of hospitalized days and physician visits; total inpatient and outpatient Medicare payments; guideline monitoring for patients with cardiovascular disease and diabetes, and bone density testing and colorectal cancer screening. RESULTS Five-year survival was similar for cases and controls (80% and 82%, respectively; p = 0.18). In the first follow-up year, comorbidity burden and health care utilization were higher for cases (p < 0.01), with most differences diminishing over time. However, the number of physician visits was higher for cases (p < 0.01) in every year, driven partly by more cancer and surgical specialist visits. Cases and controls adhered similarly to recommended bone density testing, and monitoring of cardiovascular disease and diabetes; adherence to recommended colorectal cancer screening was better among cases. CONCLUSION Breast cancer survivors’ health care utilization and disease burden return to pre-diagnosis levels after one year, yet their greater use of outpatient care persists at least five years. Quality of care for other chronic health problems is similar for cases and controls.