910 results for click-and-use software
Abstract:
Objective: To understand the knowledge and use of the voice by women who sing in a choir and the repercussions for health promotion. Methods: A qualitative study was carried out from December 2011 to February 2012 with 13 women aged 23 to 66 years, members of a university choir in Fortaleza, Ceará, Brazil. Data were collected through semi-structured interviews. Thematic analysis was applied to organize the results into categories, which were examined in the light of symbolic interactionism. Results: Two core themes were identified: knowledge about the voice and use of the voice. The choir singers defined the voice as a means of communication, a personal identity, and a way to express emotions. They did not demonstrate consistent knowledge of the anatomical and physiological aspects of the voice, but the definitions they offered show that they understand that the voice permeates personal, social, and professional spaces. The professional voice and aging stood out in the context of vocal use. The participants recognize that knowledge and use of the voice can be improved through choir activities, which points to health promotion. Conclusion: The choir singers have limited knowledge of vocal health but understand the beneficial effects of the choir on their health, broadening their understanding of the voice; this encourages the adoption of healthy habits and preventive measures, which favors vocal use.
Abstract:
High-volume compost incorporation can reduce runoff from compacted soils, but its use is also associated with elevated N and P concentrations in runoff, making it difficult to assess whether this practice will reduce nutrient loading of surface waters. Additionally, little is known about how this practice will affect the establishment of leguminous species in lawns as a means to reduce long-term fertilizer use. When 5 cm of compost was incorporated into the soil, reductions in runoff of 40% and 59% were needed for N and P losses, respectively, from a tall fescue + microclover lawn to be equivalent to those from a non-compost-amended soil supporting a well-fertilized tall fescue lawn. Use of compost as a soil amendment resulted in quicker lawn establishment and darker color compared to non-amended soil receiving a mineral fertilizer. Biosolid composts containing high amounts of ammonium severely reduced the establishment of clover in the tall fescue + microclover seed mixture.
Abstract:
Background: The present study was undertaken to develop SSR markers and assess genetic relationships among 32 date palm (Phoenix dactylifera L.) cultivars commonly grown in different geographical regions of Saudi Arabia. Results: Ninety-three novel simple sequence repeat markers were developed and screened for their ability to detect polymorphism in date palm. Around 71% of the genomic SSRs were dinucleotide, 25% trinucleotide, 3% tetranucleotide, and 1% pentanucleotide motifs. Twenty-two primers generated a total of 91 alleles, with a mean of 4.14 alleles per locus and a polymorphism percentage of 100%. The average polymorphic information content was 0.595 and the average primer discrimination power was 0.662. The expected and observed heterozygosities were 0.676 and 0.763, respectively. Pair-wise similarity values ranged from 0.06 to 0.89 and averaged 0.41 across all cultivars. The UPGMA cluster analysis, supported by principal coordinate analysis, illustrated that cultivars tend to group according to their class of maturity, region of cultivation, and fruit color. Analysis of molecular variance (AMOVA) revealed that genetic variation among and within cultivars was 27% and 73%, respectively, according to the geographical distribution of cultivars. Conclusions: The developed microsatellite markers add to the tools available for date palm characterization and can be used by researchers in population genetics, cultivar identification, and genetic resource exploration and management. The tested cultivars exhibited a significant amount of genetic diversity and could be suitable for a successful breeding program. Genomic sequences generated in this study are available at the National Center for Biotechnology Information (NCBI) Sequence Read Archive (accession number LIBGSS_039019).
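The per-locus statistics reported above (expected heterozygosity and polymorphic information content) follow standard formulas over allele frequencies. As a purely illustrative sketch, not the authors' actual analysis pipeline, the following Python snippet computes both for a hypothetical set of allele frequencies:

```python
# Illustrative computation of per-locus diversity statistics from allele
# frequencies; the example frequencies below are made up for the sketch.
from itertools import combinations

def expected_heterozygosity(freqs):
    """He = 1 - sum(p_i^2) for allele frequencies p_i at one locus."""
    return 1.0 - sum(p * p for p in freqs)

def pic(freqs):
    """Polymorphic information content (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    return (1.0 - sum(p * p for p in freqs)
            - sum(2 * (pi ** 2) * (pj ** 2) for pi, pj in combinations(freqs, 2)))

freqs = [0.4, 0.3, 0.2, 0.1]   # hypothetical frequencies of four alleles
print(round(expected_heterozygosity(freqs), 3), round(pic(freqs), 3))
```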
Abstract:
To conserve and protect the State's water resources, the State of Maryland controls the appropriation or use of its surface waters and groundwater. State law requires all agricultural operations to comply with the water appropriation permitting process, including traditional forms of agriculture, livestock and poultry operations, nursery operations, and aquaculture.
Abstract:
Objective: To investigate the knowledge and use of asthma control measurement (ACM) tools in the management of asthma among doctors working in family and internal medicine practice in Nigeria. Method: A questionnaire based on the Global Initiative for Asthma (GINA) guideline was self-administered by 194 doctors. It contained 12 test items on knowledge of ACM tools and their application. The knowledge score was obtained by adding the correct answers and classified as good if the score was ≥ 9, satisfactory if it was 6-8, and poor if it was < 6. Results: The doctors' overall knowledge score for ACM tools was 4.49 ± 2.14 (maximum of 12). Pulmonologists recorded the highest knowledge score, 10.75 ± 1.85. The majority (69.6%) had a poor knowledge score for ACM tools. Fifty (25.8%) assessed their patients' level of asthma control, 34 (17.5%) of them at every visit. Thirty-nine (20.1%) used ACM tools in their consultations; 29 (15.0%) of them used GINA-defined control, while 10 (5.2%) used the asthma control test (ACT). Use of the tools was associated with being a pulmonologist, having attended CME within the six months prior to the survey, and having graduated within the five years prior to the survey. Conclusion: The results highlight the poor knowledge and use of ACM tools and the need to address the knowledge gap.
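For illustration only, the scoring rule described above (12 items; ≥ 9 good, 6-8 satisfactory, < 6 poor) can be written as a small classification function; the example scores are hypothetical:

```python
# Toy encoding of the knowledge-score bands used in the study.
def classify_knowledge(correct_answers: int) -> str:
    if correct_answers >= 9:
        return "good"
    if correct_answers >= 6:
        return "satisfactory"
    return "poor"

print([classify_knowledge(s) for s in (4, 7, 11)])  # -> ['poor', 'satisfactory', 'good']
```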
Abstract:
Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required so that it can be used properly. To help advance Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments and to process the raw speckle images into contrast images while they are being acquired. The design of the system was successful, and it is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI and offers a full introduction to the history, theory, and applications of LSCI.
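The core processing step mentioned above, turning a raw speckle frame into a contrast image, is conventionally the local ratio K = σ/μ over a small sliding window. The sketch below shows that computation in Python/NumPy under assumed choices (a 7×7 window and a synthetic Rayleigh-distributed frame); it is not the thesis's Matlab GUI code:

```python
# Minimal spatial speckle-contrast computation: K = local std / local mean.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(frame, window=7):
    """Local contrast K = std/mean over a window x window neighbourhood."""
    frame = frame.astype(float)
    mean = uniform_filter(frame, size=window)
    mean_sq = uniform_filter(frame * frame, size=window)
    var = np.clip(mean_sq - mean * mean, 0.0, None)   # guard against rounding below zero
    return np.sqrt(var) / (mean + 1e-12)

raw = np.random.rayleigh(scale=1.0, size=(256, 256))  # stand-in for a raw speckle frame
K = speckle_contrast(raw, window=7)
print(K.shape, K.mean().round(3))
```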
Abstract:
The work presented in this thesis has been part of a Cranfield University research project. This thesis aims to design a flight control law for a large cargo aircraft using predictive control, which can ensure flight motion along the flight path exactly and on time. In particular, this work involves the modelling of a Boeing C-17 Globemaster III 6DOF model (used as the study case) using DATCOM and Matlab Simulink software. A predictive control algorithm was then developed. The majority of the work was done in a Matlab/Simulink environment. Finally, the predictive control algorithm was applied to the aircraft model, and its performance in tracking a given trajectory, optimized through a 4DT Research Software, was evaluated.
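As a rough illustration of the receding-horizon idea behind predictive control, and not the thesis's actual C-17 controller, the following sketch solves an unconstrained linear MPC tracking problem for a toy two-state model; the model, weights, and horizon are invented for the example:

```python
# Unconstrained linear MPC: build prediction matrices, solve the quadratic
# tracking cost in closed form, apply only the first input each step.
import numpy as np

def mpc_setup(A, B, Q, R, N):
    """Return Phi and gain K such that the optimal input sequence is
    U* = K @ (Xref - Phi @ x0) over a horizon of N steps."""
    n, m = B.shape
    Phi = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
    Gamma = np.zeros((N * n, N * m))
    for i in range(N):
        for j in range(i + 1):
            Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    Qbar = np.kron(np.eye(N), Q)
    Rbar = np.kron(np.eye(N), R)
    H = Gamma.T @ Qbar @ Gamma + Rbar
    K = np.linalg.solve(H, Gamma.T @ Qbar)
    return Phi, K

# Toy 2-state model (e.g. path error and its rate), sampled at 1 s.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.5], [1.0]])
Q = np.diag([1.0, 0.1])
R = np.array([[0.01]])
N = 20

Phi, K = mpc_setup(A, B, Q, R, N)
x = np.array([5.0, 0.0])        # start 5 units off the reference
Xref = np.zeros(N * 2)          # regulate to the origin over the horizon
for k in range(30):
    U = K @ (Xref - Phi @ x)    # optimal input sequence for this step
    u = U[:1]                   # receding horizon: apply only the first input
    x = A @ x + B @ u
print("final state:", x)
```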
Abstract:
Security defects are common in large software systems because of their size and complexity. Although efficient development processes, testing, and maintenance policies are applied to software systems, a large number of vulnerabilities can still remain despite these measures. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an earlier stage and in improving the security of the software in subsequent versions (referred to as releases). We expand an existing concept of software bug classification to vulnerability classification (easily reproducible and hard to reproduce) to develop a classification framework for differentiating between these vulnerabilities based on code fixes and textual reports. We then investigate the potential correlations between the vulnerability categories and classical software metrics, as well as other runtime environmental factors of reproducibility, to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The vulnerability prediction framework also helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in subsequent versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed based on collected software security defects of Mozilla Firefox.
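To make the prediction side concrete, here is a hedged sketch of the kind of supervised classifier such frameworks rely on (a Random Forest over software metrics, evaluated by cross-validation); the feature names and data are invented stand-ins, not the Mozilla Firefox dataset used in the thesis:

```python
# Toy vulnerability-reproducibility classifier over made-up metric features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical features: [lines changed in fix, cyclomatic complexity, file age (days)]
X = rng.integers(1, 500, size=(200, 3)).astype(float)
# Hypothetical labels: 1 = easily reproducible, 0 = hard to reproduce
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 50, 200) > 300).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
print("5-fold F1:", scores.mean().round(3))
```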
Abstract:
Performance measurement is seen as a main pillar of public management reforms. However, prior research has not mitigated the level of ambiguity about PMS development and use, and many questions remain unanswered. Our findings provide knowledge on the design and use of PMS in the public sector, answering calls for a complementary approach combining institutional and contingency factors. The authors find that the design of PMS and the use of performance information for external purposes are increased to legitimate the organization's work and to obtain external support. Moreover, structural variables such as type of agency and management autonomy are also determinants of the organizational responses. Finally, the authors find some coupling between the design of PMS and outcomes. Thus, both contingent and institutional approaches play an important role in the empirical model.
Abstract:
In order to make the heating of classrooms on the lower floor of the old building of the University of Évora (a 16th-century building) more efficient, five boreholes were drilled inside the university grounds (Figure 1). The purpose was to use the temperature differential of the groundwater in relation to the air, by means of a heat exchanger, and to use this process to heat the rooms with less energy, making the heating process less expensive. The wells were drilled in fractured rocks (gneisses), and the aim was to locate them at least about 100 m from one another while trying to have a hydraulic connection between them. Of the five initial wells, four were successful in terms of productivity, but only two of them (RA1 and RA2) proved to be hydraulically connected. The wells were equipped with screens along their entire drilled depth (100 m), except for the first six metres and two or three pipes of six metres each, to allow space for the installation of submersible pumps. The length of the installed screens guarantees good efficiency of the system. In the wells with no connection, the heating system can work using each single well for both abstraction and injection, but the process is much less efficient than in the cases where interaction between wells is possible through the rock's fracture network.
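As a back-of-the-envelope illustration of why the groundwater-air temperature differential matters, the thermal power recoverable from a pumped flow follows Q = ṁ·c_p·ΔT; the flow rate and temperatures in the sketch below are hypothetical, not measurements from wells RA1 and RA2:

```python
# Rough estimate of heat delivered by pumped groundwater through a heat exchanger.
def thermal_power_kw(flow_l_per_s, t_water_c, t_return_c, cp=4.186):
    """Heat delivered (kW) when water cools from t_water_c to t_return_c;
    cp is the specific heat of water in kJ/(kg*K), 1 L of water ~ 1 kg."""
    return flow_l_per_s * cp * (t_water_c - t_return_c)

# e.g. 2 L/s of 17 C groundwater returned at 12 C after the heat exchanger
print(round(thermal_power_kw(2.0, 17.0, 12.0), 1), "kW")
```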
Abstract:
The newly formed Escape and Evacuation Naval Authority regulates the provision of abandonment equipment and procedures for all Ministry of Defence vessels. As such, it ensures that access routes on board are evaluated early in the design process to maximize their efficiency and to eliminate, as far as possible, any congestion that might occur during escape. This analysis can be undertaken using a computer-based simulation of given escape scenarios that replicates the layout of the vessel and the interactions between each individual and the ship structure. One software tool that facilitates this type of analysis is maritimeEXODUS. Through large-scale testing and validation, this tool emulates human shipboard behaviour during emergency scenarios; however, it is largely based on the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. Hence there existed a clear requirement to understand the behaviour of well-trained naval personnel as opposed to civilian passengers, and to be able to model the fixtures and fittings that are exclusive to warships, thus allowing improvements to both maritimeEXODUS and other software products. Human factors trials using the Royal Navy training facilities at Whale Island, Portsmouth were recently undertaken to collect data that improves our understanding of the aforementioned differences. It is hoped that this data will form the basis of a long-term improvement package that will provide global validation of these simulation tools and assist in the development of specific escape and evacuation standards for warships. © 2005: Royal Institution of Naval Architects.
Abstract:
A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work, we gathered all the industrial needs relevant to ESA and to the main European space stakeholders, and from them we were able to consolidate a set of technical high-level requirements. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms with the semantics, assumptions, and constraints of the computational model; (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility, and fitness of constituents (ii), (iii), and (iv) were already proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved through support for design views and careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; (iii) the inclusion of space-specific concerns.
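Purely as a toy illustration of the separation of concerns the component model aims at, and not the project's actual metamodel, the sketch below keeps functional code inside a component while extra-functional concerns (an assumed period and WCET budget) are declared data that a container checks:

```python
# Illustrative split: functional code in the component, extra-functional
# properties as declared data handled by a container. Names are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class ExtraFunctional:
    period_ms: int      # intended activation period
    wcet_ms: int        # worst-case execution time budget

class SensorFilter:
    """Functional code only: no tasking or timing logic inside the component."""
    def step(self, sample: float) -> float:
        return 0.9 * sample

class Container:
    """The container owns the extra-functional concerns declared for a component."""
    def __init__(self, component, props: ExtraFunctional):
        self.component, self.props = component, props

    def schedulable_with(self, others) -> bool:
        # Crude utilisation check as a stand-in for real static analysis.
        u = sum(c.props.wcet_ms / c.props.period_ms for c in [self, *others])
        return u <= 1.0

c = Container(SensorFilter(), ExtraFunctional(period_ms=100, wcet_ms=20))
print(c.schedulable_with([]))   # True
```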
Abstract:
Although the benefits of service orientation are prevalent in the literature, the review, analysis, and evaluation of the 30 existing service analysis approaches presented in this paper show that a comprehensive approach to the identification and analysis of both business and supporting software services is missing. Based on this evaluation of existing approaches and additional sources, we close this gap by proposing an integrated, consolidated approach to business and software service analysis that combines and extends the strengths of the examined methodologies.