931 results for "use of technology"
Resumo:
The ability of gonadotrophin releasing hormone (GnRH) agonist implants to suppress ovarian activity and prevent pregnancies, long-term, was examined in heifers and cows maintained under extensive management. At three cattle stations, heifers (2-year-old) and older cows (3- to 16-year-old) were assigned to a control group that received no treatment, or were treated with high-dose (12 mg, Station A) or low-dose (8 mg, Stations B and C) GnRH agonist implants. The respective numbers of control and GnRH agonist-treated animals (heifers + cows) at each station were: Station A, 20 and 99; Station B, 19 and 89; Station C, 20 and 76. Animals were maintained with 4% bulls and monitored for pregnancy at 2-monthly intervals for approximately 12 months. Pregnancy rates for control heifers and control cows ranged from 60-90% and 80-100%, respectively, depending on the study site. The respective numbers of animals (heifers + cows) treated with GnRH agonist that conceived, and days to first conception, were: Station A, 9 (9%) and 336 +/- 3 days; Station B, 8 (10%) and 244 +/- 13 days; Station C, 20 (26%) and 231 +/- 3 days. Treatment with high-dose GnRH agonist prevented pregnancies for longer (~300 days) than treatment with low-dose GnRH agonist (~200 days). In the majority of heifers and cows treated with GnRH agonist, ovarian follicular growth was restricted to early antral follicles (2-4 mm). The findings indicate that GnRH agonist implants have considerable potential as a practical technology to suppress ovarian activity and control reproduction in female cattle maintained in extensive rangeland environments. The technology also has broader applications in diverse cattle production systems.
Resumo:
Many public organisations have been under great pressure in recent years to increase the efficiency and transparency of their outputs, to rationalise the use of public resources, and to increase the quality of service delivery. In this context, public organisations were encouraged to introduce New Public Management reforms with the goal of improving the efficiency and effectiveness of organisational performance through a new public management model. This model is based on measurement of outputs and outcomes, a clear definition of responsibilities, the transparency and accountability of governmental activities, and greater value for citizens. What types of performance measurement systems are used in police services? Based on the literature, we see that multidimensional models, such as the Balanced Scorecard, are important in many public organisations, like municipalities, universities, and hospitals. Police services are characterised by complex, diverse objectives and stakeholders. Therefore, performance measurement of these public services calls for a specific analysis. Based on a nationwide survey of all police chiefs of the Portuguese police force, we find that employee performance measurement is the main form of measurement. We also propose a strategic map for the Portuguese police service.
Resumo:
With the purpose of lowering costs and rendering the demanded information available to users with no access to the internet, service companies have adopted automated interaction technologies in their call centers, which may or may not meet the expectations of users. Based on different areas of knowledge (human-machine interaction, consumer behavior and use of IT), 13 propositions are raised and research is carried out in three parts: a focus group, a field study with users, and interviews with experts. Eleven automated service characteristics which help explain user satisfaction are listed, a preference model is proposed, and evidence for or against each of the 13 propositions is brought in. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future work, the propositions may become verifiable hypotheses through conclusive empirical research.
Resumo:
An increasing amount of research is being developed in the area where technology and humans meet. The success or failure of technologies, and the question of whether technology helps humans to fulfill their goals or hinders them, is in most cases not a technical one. User Perception and Influencing Factors of Technology in Everyday Life addresses issues of human and technology interaction. The research in this work is interdisciplinary, ranging from more technical subjects such as computer science, engineering, and information systems, to non-technical descriptions of technology and human interaction from the point of view of sociology or philosophy. This book is well suited to academics, researchers, and professionals alike, as it presents a set of theories that allow us to understand the interaction of technology and humans and to put it to practical use.
Resumo:
Glucose sensing is an issue of great interest in medical and biological applications. One possible approach to glucose detection takes advantage of measuring changes in fluorescence resonance energy transfer (FRET) between a fluorescent donor and an acceptor within a protein which undergoes glucose-induced changes in conformation. This demands the detection of fluorescent signals in the visible spectrum. In this paper we analyzed the emission spectrum obtained from fluorescent labels attached to a protein which changes its conformation in the presence of glucose, using a commercial spectrofluorometer. Different glucose nanosensors were used to measure the output spectra, with fluorescent signals located at the cyan and yellow bands of the spectrum. A new device is presented based on multilayered a-SiC:H heterostructures to detect identical transient visible signals. The transducer consists of a p-i'(a-SiC:H)-n/p-i(a-Si:H)-n heterostructure optimized for the detection of the fluorescence resonance energy transfer between fluorophores with excitation in the violet (400 nm) and emissions in the cyan (470 nm) and yellow (588 nm) ranges of the spectrum. Results show that the device photocurrent signal, measured under reverse bias and using appropriate steady-state optical bias, allows the separate detection of the cyan and yellow fluorescence signals.
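The ratiometric FRET readout described here can be illustrated with a minimal sketch (the intensity values are hypothetical, and this generic acceptor/donor ratio is an illustration of the principle, not the authors' device-specific calibration):

```python
def fret_ratio(i_yellow: float, i_cyan: float) -> float:
    """Acceptor/donor emission ratio used in ratiometric FRET sensing.

    i_yellow: emission intensity in the yellow (acceptor, ~588 nm) band
    i_cyan:   emission intensity in the cyan (donor, ~470 nm) band
    A glucose-induced conformational change that increases FRET raises
    acceptor emission and lowers donor emission, so the ratio grows.
    """
    if i_cyan <= 0:
        raise ValueError("donor intensity must be positive")
    return i_yellow / i_cyan

# Hypothetical readings before and after adding glucose
r_no_glucose = fret_ratio(i_yellow=0.40, i_cyan=1.00)  # 0.40
r_glucose = fret_ratio(i_yellow=0.70, i_cyan=0.65)     # ~1.08
assert r_glucose > r_no_glucose  # higher FRET -> larger ratio
```

Tracking the ratio rather than a single channel cancels out fluctuations in excitation power and label concentration, which is why two separately detectable bands matter.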
Resumo:
This project was developed to fully assess the indoor air quality in archives and libraries from a fungal flora point of view. It uses classical methodologies such as traditional culture media – for the viable fungi – and modern molecular biology protocols, especially relevant to assess the non-viable fraction of the biological contaminants. Denaturing high-performance liquid chromatography (DHPLC) has emerged as an alternative to denaturing gradient gel electrophoresis (DGGE) and has already been applied to the study of a few bacterial communities. We propose the application of DHPLC to the study of fungal colonization on paper-based archive materials. This technology allows for the identification of each component of a mixture of fungi based on their genetic variation. In a highly complex mixture of microbial DNA this method can be used simply to study the population dynamics, and it also allows for sample fraction collection, which can, in many cases, be immediately sequenced, circumventing the need for cloning. Some examples of the methodological application are shown. Also applied is fragment length analysis for the study of mixed Candida samples. Both of these methods can later be applied in various fields, such as clinical and sand sample analysis. So far, the environmental analyses have been extremely useful to determine potentially pathogenic/toxinogenic fungi such as Stachybotrys sp., Aspergillus niger, Aspergillus fumigatus, and Fusarium sp. This work will hopefully lead to more accurate evaluation of environmental conditions for both human health and the preservation of documents.
Resumo:
Doctoral thesis in Education Sciences
Resumo:
Dissertation presented as a partial requirement for the degree of Master in Statistics and Information Management
Resumo:
This work presents research conducted to understand the role of indicators in decisions of technology innovation. A gap was detected in the literature of innovation and technology assessment about the use and influence of indicators in this type of decision. It was important to address this gap because indicators are frequent elements of innovation and technology assessment studies. The research was designed to determine the extent of the use and influence of indicators in decisions of technology innovation, to characterize the role of indicators in these decisions, and to understand how indicators are used in these decisions. The latter involved the test of four possible explanatory factors: the type and phase of decision, and the context and process of construction of evidence. Furthermore, it focused on three Portuguese innovation groups: public researchers, business R&D&I leaders and policymakers. The research used a combination of methods to collect quantitative and qualitative information, such as surveys, case studies and social network analysis. This research concluded that the use of indicators is different from their influence in decisions of technology innovation. In fact, there is a high use of indicators in these decisions, but their influence is lower and differs across the innovation groups. This suggests that political-behavioural methods are also involved in the decisions to different degrees. The main social influences on the decisions came mostly from hierarchies, knowledge-based contacts and users. Furthermore, the research established that indicators played mostly symbolic roles in decisions of policymakers and business R&D&I leaders, although their role with researchers was more differentiated.
Indicators were also described as helpful instruments for conducting a reasonable interpretation of data and for balancing options in innovation and technology assessment studies, in particular when contextualised, described in detail and accompanied by discussion of the options made. Results suggest that there are four main explanatory factors for the role of indicators in these decisions. First, the type of decision appears to be a factor to consider when explaining the role of indicators. In fact, each type of decision influenced the way indicators were used, and each type of decision used different types of indicators. Results for policy-making were particularly different from decisions on the acquisition and development of products/technology. Second, the phase of the decision can help to explain the role indicators play in these decisions. Results distinguished between two phases detected in all decisions – before and after the decision – as well as two other phases that can complement the decision process and in which indicators can be involved. Third, the context of decision is an important factor to consider when explaining the way indicators are taken into consideration in policy decisions. In fact, the role of indicators can be influenced by the particular context of the decision maker, in which any type of evidence can be selected or downplayed. More importantly, the use of persuasive analytical evidence appears to be related to the degree of dispute in the policy context. Fourth and last, the process of construction of evidence is a factor to consider when explaining the way indicators are involved in these decisions. In fact, indicators and other evidence were brought into the decision processes according to their availability and their capacity to support the different arguments and interests of the actors and stakeholders.
In one case, an indicator lost much of its persuasive strength through the controversies it underwent during the decision process. Therefore, it can be argued that the use of indicators is high but not very influential; their role is mostly symbolic in policy and business decisions, but varies among researchers. The role of indicators in these decisions depends on the type and phase of the decision and on the context and process of construction of evidence. The latter two are related to the particular context of each decision maker and to the existence of elements of dispute and controversy that influence the way indicators are introduced into the decision-making process.
Resumo:
Odour nuisance in other European countries has led to the development of techniques which employ panels of human assessors for the determination of environmental odours. Odour measurement is not widely practised in Ireland, yet local authorities are frequently in receipt of odour-derived public complaints. This dissertation examines the fundamentals of odour nuisance in terms of how we perceive odours, common sources of environmental odours, the principles of odour measurement (in particular the Dutch pre-standard on olfactometry) and the extent to which odour nuisance is a problem in Ireland. The intention is to provide a reference document for use by those interested parties in the country who may be variously involved in policy making, legislative development or enforcement of environmental law, or by any person who has an interest in odours and the public nuisance they can give rise to. In particular, the aim was to provide previously undocumented information on the prevalence of odour nuisance in Ireland, the exercise of the available powers to control odours, and the possible value of odour measurement as part of a regulatory process. A questionnaire was circulated to all local authorities in the country and 82% responded with information on their experiences and views on the subject of odours. The results of the survey are presented in summary and detailed form.
Resumo:
This study utilised recent developments in forensic aromatic hydrocarbon fingerprint analysis to characterise and identify specific biogenic, pyrogenic and petrogenic contamination. The fingerprinting and data interpretation techniques discussed include the recognition of:
• the distribution patterns of hydrocarbons (alkylated naphthalene, phenanthrene, dibenzothiophene, fluorene, chrysene and phenol isomers),
• analysis of "source-specific marker" compounds (individual saturated hydrocarbons, including n-alkanes, n-C5 through n-C40),
• selected benzene, toluene, ethylbenzene and xylene isomers (BTEX),
• the recalcitrant isoprenoids pristane and phytane, and
• the determination of diagnostic ratios of specific petroleum/non-petroleum constituents, and the application of various statistical and numerical analysis tools.
An unknown sample from the Irish Environmental Protection Agency (EPA) for origin characterisation was subjected to analysis by gas chromatography utilising both flame ionisation and mass spectral detection techniques, in comparison to known reference materials. The percentages of the individual polycyclic aromatic hydrocarbons (PAHs) and biomarker concentrations in the unknown sample were normalised to the sum of the analytes, and the results were compared with the corresponding results for a range of reference materials. In addition to the determination of conventional diagnostic PAH and biomarker ratios, a number of "source-specific marker" isomeric PAHs within the same alkylation levels were determined, and their relative abundance ratios were computed in order to definitively identify and differentiate the various sources. Statistical logarithmic star plots were generated from both sets of data to give a pictorial representation of the comparison between the unknown sample and reference products.
The study successfully characterised the unknown sample as being contaminated with a “coal tar” and clearly demonstrates the future role of compound ratio analysis (CORAT) in the identification of possible source contaminants.
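The normalisation and diagnostic-ratio steps described above can be sketched as follows (the concentration values and the pristane/phytane pairing are illustrative assumptions, not data from the study):

```python
def normalise_to_sum(concentrations: dict) -> dict:
    """Express each analyte as a percentage of the summed analytes,
    so samples of different overall concentration can be compared."""
    total = sum(concentrations.values())
    return {name: 100.0 * c / total for name, c in concentrations.items()}

def diagnostic_ratio(conc: dict, a: str, b: str) -> float:
    """Ratio of two source-specific markers, e.g. pristane/phytane."""
    return conc[a] / conc[b]

# Hypothetical marker concentrations (arbitrary units)
sample = {"pristane": 6.0, "phytane": 15.0, "C2-phenanthrene": 29.0}
pct = normalise_to_sum(sample)                           # percentages summing to 100
pr_ph = diagnostic_ratio(sample, "pristane", "phytane")  # 0.4
```

Because a ratio of two co-occurring markers is largely independent of dilution and overall concentration, such ratios are what get compared against reference products (e.g. on star plots) rather than raw peak areas.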
Resumo:
This study analyses the area of construction and demolition waste (C&DW) auditing. The production of C&DW has grown year on year since the Environmental Protection Agency (EPA) first published a report in 1996 which provided data on C&DW quantities for 1995 (EPA, 1996a). The most recent report produced by the EPA is based on data for 2005 (EPA, 2006) and estimated the quantity of C&DW produced for that period at 14 931 486 tonnes. However, this is a 'data update' report containing an update on certain waste statistics, so any total provided would not be a true reflection of the waste produced for that period. This illustrates that a more construction-site-specific form of data is required. The Department of Building and Civil Engineering at the Galway-Mayo Institute of Technology has carried out two recent research projects (Grimes, 2005; Kelly, 2006) in this area, which produced waste production indicators based on site-specific data. This involved the design and testing of an original auditing tool based on visual characterisation and the application of conversion factors. One of the main recommendations of these studies was to compare this visual characterisation approach with a photogrammetric sorting methodology. This study investigates the application of photogrammetric sorting on a residential construction site in the Galway region. A visual characterisation study was also carried out on the same project to compare the two methodologies and assess their practical application in a construction site environment. Data collected from the waste management contractor on site was also used to provide further evaluation. From this, a set of waste production indicators for new residential construction was produced:
• 50.8 kg/m2 for new residential construction using data provided by the visual characterisation method and the Landfill Levy conversion factors.
• 43 kg/m2 for new residential construction using data provided by the photogrammetric sorting method and the Landfill Levy conversion factors.
• 23.8 kg/m2 for new residential construction using data provided by the waste management contractor (WMC).
The acquisition of the data from the waste management contractor was a key element in testing the information produced by the visual characterisation and photogrammetric sorting methods: the actual weight recorded by the contractor differs significantly from the quantities estimated by the two methods.
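A kg/m2 indicator of this kind is simply total waste mass divided by constructed floor area; a minimal sketch (the tonnage and floor area below are hypothetical, chosen only to reproduce the 50.8 kg/m2 figure):

```python
def waste_indicator(total_waste_kg: float, gross_floor_area_m2: float) -> float:
    """Waste production indicator: kg of C&DW per m2 of constructed floor area."""
    if gross_floor_area_m2 <= 0:
        raise ValueError("floor area must be positive")
    return total_waste_kg / gross_floor_area_m2

# Hypothetical site: 25.4 tonnes of audited waste over 500 m2 of new build
indicator = waste_indicator(total_waste_kg=25_400, gross_floor_area_m2=500)  # 50.8 kg/m2
```

Expressing waste per unit floor area is what makes indicators from different sites and auditing methods directly comparable.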
Resumo:
Similar immunizations of mice and hybridoma technology were used by several investigators to raise monoclonal antibodies, which identified a limited range of epitopes and antigenic molecules. Further studies would have scope for revealing yet more novel structures. The existing MABs are agreed standard reagents, available to investigators and valuable for several applications. At least six epitopes specific for M. leprae have been defined in molecular terms. Monoclonal antibody-based immunoassays proved to be invaluable for the screening of recombinant DNA clones and for the topographic study of individual epitopes. Purification of antigens using affinity chromatography requires further development of techniques, whilst the serology of leprosy is open for clinical and epidemiological evaluation.
Resumo:
Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity, and should therefore be coupled to confirmatory tests when the results are negative (e.g. Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than former immunochromatographic assays (e.g. Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing, targeting not just a single pathogen, but all possible agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will provide clinical microbiologists with the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.
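Why a negative result from a low-sensitivity rapid test needs confirmation follows from Bayes' rule; a minimal sketch (the sensitivity, specificity and prevalence figures are illustrative assumptions, not values for any specific POC test):

```python
def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Positive and negative predictive values from test characteristics
    and pre-test probability (Bayes' rule on the 2x2 contingency table)."""
    tp = sensitivity * prevalence              # true positives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical rapid antigen test: 85% sensitive, 98% specific,
# used in a population with 30% disease prevalence
ppv, npv = predictive_values(sensitivity=0.85, specificity=0.98, prevalence=0.30)
# npv ~ 0.94: roughly 6% of negatives are false, hence confirmatory testing
```

The same arithmetic explains the opposite case in the text: a highly specific but imperfect test in a low-prevalence setting yields positives that warrant confirmation.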
Resumo:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis on calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality and sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.
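IRMS results are conventionally reported in delta notation relative to an international standard, which is what makes interlaboratory comparison possible; a minimal sketch (the sample ratio is hypothetical; the VPDB 13C/12C value is the commonly cited one):

```python
def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """Isotope ratio in delta notation (per mil) relative to a standard,
    e.g. 13C/12C against VPDB for carbon."""
    return (r_sample / r_standard - 1.0) * 1000.0

VPDB_13C_12C = 0.011180  # commonly cited 13C/12C ratio of the VPDB standard

# Hypothetical measured 13C/12C ratio of a sample
d13c = delta_per_mil(r_sample=0.010900, r_standard=VPDB_13C_12C)
# negative delta: the sample is depleted in 13C relative to the standard
```

Because the delta value is anchored to a shared reference material rather than to raw instrument ratios, two laboratories with different calibration strategies can still compare results, which is the crux of the standardisation problem the abstract raises.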