910 results for click-and-use software
Abstract:
Workplace noise has become a major issue in industry, not only because of workers' health but also because of safety. Electric motors, particularly inverter-fed induction motors, emit objectionably high levels of noise. This has led to the emergence of a research area concerned with the measurement and mitigation of acoustic noise. This paper presents a low-cost option for measurement and spectral analysis of the acoustic noise emitted by electric motors. The system consists of an electret microphone, an amplifier and a filter. It makes use of the Windows sound card and associated software for data acquisition and analysis. The measurement system is calibrated against a professional sound level meter. Acoustic noise measurements are made on an induction motor drive using the proposed system, as per the relevant international standards. These measurements are seen to match closely with those of a professional meter.
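A minimal sketch of the kind of sound-card-based spectral analysis the paper describes, in Python. The file name, segment length and calibration offset are illustrative assumptions; only the overall approach (acquire through the sound card, calibrate against a reference meter, inspect the spectrum) comes from the abstract.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

CAL_DB = 0.0  # hypothetical flat calibration offset (dB) from the reference meter

fs, x = wavfile.read("motor_noise.wav")   # hypothetical recording made via the sound card
if x.ndim > 1:
    x = x[:, 0]                           # keep one channel if the recording is stereo
x = x.astype(np.float64) / np.iinfo(np.int16).max  # normalise 16-bit samples to [-1, 1]

# Welch's method: averaged periodograms of overlapping windowed segments
f, pxx = welch(x, fs=fs, nperseg=4096)

# Relative level per frequency bin, shifted by the calibration offset
level_db = 10 * np.log10(pxx + 1e-20) + CAL_DB
peak = f[np.argmax(level_db)]
print(f"Dominant noise component near {peak:.0f} Hz")
```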
Abstract:
Aboriginal peoples in Canada have been mapping aspects of their cultures for more than a generation. Indians, Inuit, Métis, non-status Indians and others have called their maps by different names at various times and places: land use and occupancy; land occupancy and use; traditional use; traditional land use and occupancy; current use; culturally sensitive areas; and so on. I use "land use and occupancy mapping" in a generic sense to include all the above. The term refers to the collection of interview data about traditional use of resources and occupancy of lands by First Nation persons, and the presentation of those data in map form. Think of it as the geography of oral tradition, or as the mapping of cultural and resource geography. (PDF contains 81 pages.)
Abstract:
Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results: In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for the analysis of MSE data from ProteinLynx Global Server and the integration of technical replicates. PAnalyzer is an easy-to-use, multiplatform, free software tool.
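A minimal sketch of peptide-evidence-based protein grouping of the kind the abstract describes. The category names and rules below are a plausible reconstruction for illustration, not PAnalyzer's actual implementation.

```python
from collections import defaultdict

def group_proteins(protein_peptides):
    """protein_peptides maps protein ID -> set of identified peptide sequences."""
    # Count how many proteins each peptide maps to
    peptide_count = defaultdict(int)
    for peps in protein_peptides.values():
        for p in peps:
            peptide_count[p] += 1

    # Collect proteins that share an identical peptide set
    same_evidence = defaultdict(list)
    for prot, peps in protein_peptides.items():
        same_evidence[frozenset(peps)].append(prot)

    categories = {}
    for prot, peps in protein_peptides.items():
        if any(peptide_count[p] == 1 for p in peps):
            categories[prot] = "conclusive"         # has at least one unique peptide
        elif len(same_evidence[frozenset(peps)]) > 1:
            categories[prot] = "indistinguishable"  # identical shared evidence
        else:
            categories[prot] = "ambiguous"          # only shared peptides
    return categories

example = {
    "P1": {"pepA", "pepB"},   # pepA is unique to P1 -> conclusive
    "P2": {"pepB", "pepC"},   # same evidence as P3 -> indistinguishable
    "P3": {"pepB", "pepC"},
    "P4": {"pepB"},           # only a shared peptide -> ambiguous
}
print(group_proteins(example))
```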
Abstract:
The report introduces software sustainability, provides definitions, clearly demonstrates that software is not the same as data, and illustrates aspects of sustainability in the software lifecycle. The recommendations state that improving software sustainability requires a number of changes: some technical and others societal, some small and others significant. We must start by raising awareness of researchers' reliance on software. This goal will become easier if we recognise the valuable contribution that software makes to research, and reward those people who invest their time into developing reliable and reproducible software. The adoption of software has led to significant advances in research. But if we do not change our research practices, the continued rise in software use will be accompanied by a rise in retractions. Ultimately, anyone who is concerned about the reliability and reproducibility of research should be concerned about software sustainability. Besides highlighting the benefits of software sustainability and addressing the societal and technical barriers to it, the report provides access to expertise in software sustainability and outlines the role of funders. The report concludes with a short landscape of national activities in Europe and beyond. As a result of the workshop, steps will be explored to establish European coordination and cooperation of national initiatives.
Abstract:
Different public and private organisations collect and make available a mass of data about the socio-economic reality of different nations. The Brazilian government today has a manifest interest in disseminating a wide range of information to the most diverse user profiles. A number of limitations, however, still hinder a more massive and democratic dissemination, among them the heterogeneity of the data sources, their dispersion and their unfriendly presentation formats. Owing to the complexity inherent in the geographic information involved, which produces incompatibilities at several levels, data interchange between geographic information systems is not a trivial problem. For Web applications, one solution is Web Services, which allow new applications to interact with existing ones and make systems developed on different platforms compatible. In this context, the objective of this work is to show the possibilities of building portals using free software, Web Services technology and the Open Geospatial Consortium (OGC) standards for the dissemination of spatial data. To evaluate and test the selected technologies and demonstrate their effectiveness, an example portal of socio-economic data was developed, comprising information from a local server and from remote servers. The contributions of this work are the provision of dynamic maps, the generation of maps through the composition of maps served by remote and local servers, and the use of the OGC WMC standard. Analysis of the portal prototype shows, however, that locating and requesting Web Services are not easy tasks for a typical Internet user. In this direction, future work in the domain of geographic information portals could adopt Representational State Transfer (REST) technology.
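A minimal sketch of the kind of OGC WMS GetMap request such a portal issues to compose maps from remote servers. The endpoint and layer name are hypothetical; the request parameters follow the WMS 1.1.1 standard.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

WMS_SERVER = "http://example.org/cgi-bin/mapserv"  # hypothetical map server endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "municipios",            # hypothetical socio-economic layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-74.0,-34.0,-34.0,5.5",   # lon/lat box covering roughly Brazil
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

url = WMS_SERVER + "?" + urlencode(params)
with urlopen(url) as resp:             # the server answers with a rendered map image
    png = resp.read()
with open("map.png", "wb") as f:
    f.write(png)
```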
Abstract:
The increase in harbor seal (Phoca vitulina richardsi) abundance, concurrent with the decrease in salmonid (Oncorhynchus spp.) and other fish stocks, raises concerns about the potential negative impact of seals on fish populations. Although harbor seals are found in rivers and estuaries, their presence is not necessarily indicative of exclusive or predominant feeding in these systems. We examined the diet of harbor seals in the Umpqua River, Oregon, during 1997 and 1998 to indirectly assess whether or not they were feeding in the river. Fish otoliths and other skeletal structures were recovered from 651 scats and used to identify seal prey. The use of all diagnostic prey structures, rather than just otoliths, increased our estimates of the number of taxa, the minimum number of individuals and the percent frequency of occurrence (%FO) of prey consumed. The %FO indicated that the most common prey were pleuronectids, Pacific hake (Merluccius productus), Pacific staghorn sculpin (Leptocottus armatus), osmerids, and shiner surfperch (Cymatogaster aggregata). The majority (76%) of prey were fish that inhabit marine waters exclusively or fish found in both marine and estuarine areas (e.g. anadromous spp.), which would indicate that seals forage predominantly at sea and use the estuary for resting and opportunistic feeding. Salmonid remains were encountered in 39 samples (6%); two samples contained identifiable otoliths, which were determined to be from chinook salmon (O. tshawytscha). Because of the complex salmonid composition in the Umpqua River, we used molecular genetic techniques on salmonid bones retrieved from scats to distinguish species that were rare from those that were abundant. Of the 37 scats with salmonid bones but no otoliths, bones were identified genetically as chinook or coho (O. kisutch) salmon, or steelhead trout (O. mykiss) in 90% of the samples.
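For reference, percent frequency of occurrence is simply the share of scat samples in which a prey taxon appears. A short Python sketch, using only the 39-of-651 salmonid count reported above:

```python
def percent_fo(occurrences: int, total_scats: int) -> float:
    """Percent frequency of occurrence: share of samples containing the taxon."""
    return 100.0 * occurrences / total_scats

# Salmonid remains were found in 39 of 651 scats -> about 6 %FO
print(f"{percent_fo(39, 651):.1f} %FO")
```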
Abstract:
Information and Communication Technology (ICT) is becoming increasingly central to many people's lives, making it possible to be connected in any place at any time, be unceasingly and instantly informed, and benefit from greater economic and educational opportunities. With all the benefits afforded by these new-found capabilities, however, come potential drawbacks. A plethora of new PCs, laptops, tablets, smartphones, Bluetooth, the internet, Wi-Fi (the list goes on) expect us to know, or be able to guess, what, where and when to connect, click, double-click, tap, flick or scroll in order to realise these benefits, and to have the physical and cognitive capability to do all these things. One of the groups most affected by this increase in high-demand technology is older people. They do not understand and use technology in the same way that younger generations do, because they grew up in the simpler electro-mechanical era and embedded that particular model of the world in their minds. Any consequential difficulty in familiarising themselves with modern ICT and effectively applying it to their needs can also be exacerbated by age-related changes in vision, motor control and cognitive functioning. Such challenges lead to digital exclusion. Much has been written about this topic over the years, usually by academics from the area of inclusive product design. The issue is complex and it is fair to say that no one researcher has the whole picture. It is difficult to understand and adequately address the issue of digital exclusion among the older generation without looking across disciplines and at industry's and government's understanding, motivation and efforts toward resolving this important problem. To do otherwise is to risk misunderstanding the true impact that ICT has and could have on people's lives across all generations. In this European Year of Active Ageing and Solidarity between Generations, and as the British government moves forward with its Digital by Default initiative as part of a wider objective to make ICT accessible to as many people as possible by 2015, the Engineering Design Centre (EDC) at the University of Cambridge collaborated with BT to produce a book of thought pieces to address, and where appropriate redress, these important and long-standing issues. "Ageing, Adaption and Accessibility: Time for the Inclusive Revolution!" brings together opinions and insights from twenty-one prominent thought leaders from government, industry and academia regarding the problems, opportunities and strategies for combating digital exclusion among senior citizens. The contributing experts were selected as individuals, rather than representatives of organisations, to provide the broadest possible range of perspectives. They are renowned in their respective fields and their opinions are formed not only from their own work, but also from the contributions of others in their area. Their views were elicited through conversations conducted by the editors of this book, who then drafted the thought pieces to be edited and approved by the experts. We hope that this unique collection of thought pieces will give you a broader perspective on ageing and people's adaptation to the ever-changing world of technology, and insights into better ways of designing digital devices and services for the older population.
Abstract:
The State Key Laboratory of Computer Science (SKLCS) is committed to basic research in computer science and software engineering. The research topics of the laboratory include: concurrency theory, theory and algorithms for real-time systems, formal specifications based on context-free grammars, semantics of programming languages, model checking, automated reasoning, logic programming, software testing, software process improvement, middleware technology, parallel algorithms and parallel software, computer graphics and human-computer interaction. This paper describes these topics in some detail and summarizes some results obtained in recent years.
Abstract:
Poster is based on the following paper: C. Kwan and M. Betke. Camera Canvas: Image editing software for people with disabilities. In Proceedings of the 14th International Conference on Human Computer Interaction (HCI International 2011), Orlando, Florida, July 2011.
Abstract:
The emergence of a sensor-networked world produces a clear and urgent need for well-planned, safe and secure software engineering. It is the role of universities to prepare graduates with the knowledge and experience to enter the workforce with a clear understanding of software design and its application to the future safety of computing. The snBench (Sensor Network WorkBench) project aims to support the programming and deployment of Sensor Network Applications, enabling shared sensor-embedded spaces to be easily tasked with various sensory applications by different users for simultaneous execution. In this report we discuss our experience using the snBench research project as the foundation for a semester-long project in a graduate-level software engineering class at Boston University (CS511).
Abstract:
There has been an increased use of the Doubly-Fed Induction Machine (DFIM) in ac drive applications in recent times, particularly in the field of renewable energy systems and other high-power variable-speed drives. The DFIM is widely regarded as the optimal generation system for both onshore and offshore wind turbines and has also been considered in wave power applications. Wind power generation is the most mature renewable technology. However, wave energy has attracted large interest recently as the potential for power extraction is very significant. Various wave energy converter (WEC) technologies currently exist, with the oscillating water column (OWC) type converter being one of the most advanced. There are fundamental differences between the profile of the pneumatic power supplied by the OWC WEC and that of a wind turbine, and this causes significant challenges in the selection and rating of electrical generators for OWC devices. The thesis initially aims to provide an accurate per-phase equivalent circuit model of the DFIM by investigating various characterisation testing procedures. Novel testing methodologies based on the series-coupling tests are employed and are found to provide a more accurate representation of the DFIM than the standard IEEE testing methods, because the series-coupling tests provide a direct method of determining the equivalent-circuit resistances and inductances of the machine. A second novel method, known as the extended short-circuit test, is also presented and investigated as an alternative characterisation method. Experimental results on a 1.1 kW DFIM and a 30 kW DFIM utilising the various characterisation procedures are presented in the thesis. The various test methods are analysed and validated through comparison of model predictions and torque-versus-speed curves for each induction machine. Sensitivity analysis is also used as a means of quantifying the effect of experimental error on the results taken from each of the testing procedures, and to determine the suitability of the test procedures for characterising each of the devices. The series-coupling differential test is demonstrated to be the optimum test. The research then focuses on the OWC WEC and the modelling of this device. A software model is implemented based on data obtained from a scaled prototype device situated at the Irish test site. Test data from the electrical system of the device are analysed and used to develop a performance curve for the air turbine utilised in the WEC. This performance curve was applied in a software model to represent the turbine in the electro-mechanical system, and the software results are validated against the measured electrical output data from the prototype test device. Finally, once both the DFIM and the OWC WEC power take-off system have been modelled successfully, an investigation of the application of the DFIM to the OWC WEC model is carried out to determine the electrical machine rating required for the pulsating power derived from the OWC WEC device. Thermal analysis of a 30 kW induction machine is carried out using a first-order thermal model. The simulations quantify the limits of operation of the machine and enable the development of rating requirements for the electrical generation system of the OWC WEC. The thesis can be considered to have three sections. The first section contains Chapters 2 and 3 and focuses on the accurate characterisation of the doubly-fed induction machine using various testing procedures. The second section, containing Chapter 4, concentrates on the modelling of the OWC WEC power take-off, with particular focus on the Wells turbine. Validation of this model is carried out through comparison of simulations and experimental measurements. The third section utilises the OWC WEC model from Chapter 4 with a 30 kW induction machine model to determine the optimum device rating for the specified machine. Simulations are carried out to perform thermal analysis of the machine to give a general insight into electrical machine rating for an OWC WEC device.
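As a sketch of how a characterised per-phase equivalent circuit yields the torque-versus-speed curves used for validation, the standard Thevenin-based induction machine calculation is shown below in Python. The machine parameters are illustrative placeholders, not the values of the 1.1 kW or 30 kW machines in the thesis.

```python
import numpy as np

V1, f, poles = 230.0, 50.0, 4    # per-phase voltage (V), supply frequency (Hz), pole count
R1, X1 = 1.5, 2.0                # stator resistance and leakage reactance (ohm), illustrative
R2, X2 = 1.2, 2.0                # rotor values referred to the stator (ohm), illustrative
Xm = 50.0                        # magnetising reactance (ohm), illustrative

ws = 4 * np.pi * f / poles       # synchronous mechanical speed (rad/s)

# Thevenin equivalent of the stator branch in parallel with the magnetising branch
Zth = 1j * Xm * (R1 + 1j * X1) / (R1 + 1j * (X1 + Xm))
Vth = V1 * Xm / abs(R1 + 1j * (X1 + Xm))
Rth, Xth = Zth.real, Zth.imag

# Electromagnetic torque over the motoring slip range
slip = np.linspace(0.001, 1.0, 500)
torque = (3 * Vth**2 * (R2 / slip)) / (
    ws * ((Rth + R2 / slip) ** 2 + (Xth + X2) ** 2)
)

print(f"Peak torque ~ {torque.max():.1f} N·m at slip {slip[np.argmax(torque)]:.3f}")
```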
Abstract:
The concept of pellicular particles was suggested by Horváth and Lipsky over fifty years ago. The reasoning behind these particles was to improve column efficiency by shortening the pathways analyte molecules can travel, thereby reducing the effect of the A and C terms. Several types of shell particles were successfully marketed around this time; however, with the introduction of high-quality fully porous silica under 10 μm, shell particles faded into the background. In recent years a new generation of core-shell particles has become popular within the separation science community. These particles allow fast and efficient separations that can be carried out on conventional HPLC systems. Chapter 1 of this thesis introduces the chemistry of chromatographic stationary phases, with an emphasis on silica bonded phases, particularly focusing on the current state of technology in this area. The main focus is on superficially porous silica particles as a support material for liquid chromatography. A summary of the history and development of these particles over the past few decades is explored, along with current methods of synthesis of shell particles. While commercial shell particles have a rough outer surface, Chapter 2 focuses on a novel approach to the growth of smooth-surface superficially porous particles in a step-by-step manner: from the Stöber methodology to the seeded growth technique, and finally to the layer-by-layer growth of the porous shell. The superficially porous particles generated in this work have an overall diameter of 2.6 μm with a 350 nm porous shell; these silica particles were characterised using SEM, TEM and BET analysis. The uniform spherical nature of the particles, along with their surface area, pore size and particle size distribution, is examined in this chapter. I discovered that these smooth-surface shell particles can be synthesised to give surface area and pore size comparable to commercial brands. Chapter 3 deals with the bonding of the particles prepared in Chapter 2 with C18 functionality; one batch with a narrow and one with a wide particle size distribution. This chapter examines the chromatographic and kinetic performance of these silica stationary phases, and compares them to a commercial superficially porous silica phase with a rough outer surface. I found that the particle size distribution does not seem to be the major contributor to the improvement in efficiency. The surface morphology of the particles appears to play an important role in the packing process of these particles and influences the van Deemter terms. Chapter 4 focuses on the functionalisation of 2.6 μm smooth-surface superficially porous particles with a variety of fluorinated and phenyl silanes. The same processes were carried out on 3.0 μm fully porous silica particles to provide a comparison. All phases were assessed using elemental analysis, thermogravimetric analysis and nitrogen sorption analysis, and chromatographically evaluated using the Neue test. I observed comparable results for the 2.6 μm shell pentafluorophenyl propyl silica when compared to 3.0 μm fully porous silica. Chapter 5 moves towards nano-particles, with the synthesis of sub-1 μm superficially porous particles, their characterisation and their use in chromatography. The particles prepared are 750 nm in total with a 100 nm shell. All reactions and testing carried out on these 750 nm core-shell particles are also carried out on 1.5 μm fully porous particles in order to give a comparative result. The 750 nm core-shell particles can be synthesised quickly and are very uniform. The main drawback in their use for HPLC is the system itself, owing to the backpressure experienced using sub-1 μm particles. The synthesis of modified Stöber particles is also examined in this chapter, with a range of non-porous silica and shell silica from 70 nm to 750 nm being tested for use on a Langmuir-Blodgett system. These smooth-surface shell particles have only been in existence since 2009. The results displayed in this thesis demonstrate how much potential smooth-surface shell particles have, provided that more in-depth optimisation is carried out. The packing studies reported in this thesis aim to be a starting point for a more sophisticated methodology, which in turn can lead to greater chromatographic improvements.
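The A and C terms referred to above are those of the van Deemter equation, H = A + B/u + C·u, which relates plate height H (a measure of band broadening, lower is better) to the mobile-phase linear velocity u. A short Python sketch with illustrative, not fitted, coefficients:

```python
import numpy as np

def plate_height(u, A, B, C):
    """van Deemter plate height: H = A + B/u + C*u."""
    return A + B / u + C * u

u = np.linspace(0.1, 5.0, 200)                      # linear velocity (mm/s)
H_porous = plate_height(u, A=2.0, B=1.0, C=0.8)     # fully porous (illustrative values)
H_shell = plate_height(u, A=1.2, B=1.0, C=0.4)      # core-shell: smaller A and C terms

# The curve minimum sits at u_opt = sqrt(B/C), so lowering C also widens
# the range of usable velocities, enabling faster separations.
u_opt = np.sqrt(1.0 / 0.4)
print(f"Core-shell optimum velocity ~ {u_opt:.2f} mm/s, "
      f"H_min ~ {plate_height(u_opt, 1.2, 1.0, 0.4):.2f}")
```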
Abstract:
OBJECTIVE: The Veterans Health Administration has developed My HealtheVet (MHV), a Web-based portal that links veterans to their care in the Veterans Affairs (VA) system. The objective of this study was to measure diabetic veterans' access to and use of the Internet, and their interest in using MHV to help manage their diabetes. MATERIALS AND METHODS: Cross-sectional mailed survey of 201 patients with type 2 diabetes and hemoglobin A1c > 8.0% receiving primary care at any of five primary care clinic sites affiliated with a VA tertiary care facility. Main measures included Internet usage, access, and attitudes; computer skills; interest in using the Internet; awareness of and attitudes toward MHV; demographics; and socioeconomic status. RESULTS: A majority of respondents reported having access to the Internet at home. Nearly half of all respondents had searched online for information about diabetes, including some who did not have home Internet access. More than a third obtained "some" or "a lot" of their health-related information online. Forty-one percent reported being "very interested" in using MHV to help track their home blood glucose readings, a third of whom did not have home Internet access. Factors associated with being "very interested" were as follows: having access to the Internet at home (p < 0.001), "a lot/some" trust in the Internet as a source of health information (p = 0.002), lower age (p = 0.03), and some college education (p = 0.04). Neither race (p = 0.44) nor income (p = 0.25) was significantly associated with interest in MHV. CONCLUSIONS: This study found that a diverse sample of older VA patients with sub-optimally controlled diabetes had a level of familiarity with and access to the Internet comparable to an age-matched national sample. In addition, there was a high degree of interest in using the Internet to help manage their diabetes.
Abstract:
Computer-based mathematical models describing the aircraft evacuation process and aircraft fire have a role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, and in post-mortem accident investigation. As the cost and risk involved in performing large-scale fire/evacuation experiments for the next generation 'Very Large Aircraft' (VLA) are expected to be high, the development and use of these modelling tools may become essential if these aircraft are to prove a viable reality. By describing the present capabilities and limitations of the EXODUS evacuation model and associated fire models, this paper examines the future development and data requirements of these models.
Abstract:
Computer-based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, in cabin crew training and in post-mortem accident investigation. As the risk of personal injury and the costs involved in performing large-scale evacuation experiments for the next generation 'Ultra High Capacity Aircraft' (UHCA) are expected to be high, the development and use of these evacuation modelling tools may become essential if these aircraft are to prove a viable reality. In this paper the capabilities and limitations of the airEXODUS evacuation model are described, along with its successful application to the prediction of a recent certification trial prior to the actual trial taking place. Also described is a newly defined parameter, known as OPS, which can be used as a measure of evacuation trial optimality. In addition, sample evacuation simulations in the presence of fire atmospheres are described. Finally, the data requirements of the airEXODUS evacuation model are discussed, along with several projects currently underway at the University of Greenwich designed to obtain these data. Included in this discussion is a description of the AASK (Aircraft Accident Statistics and Knowledge) database, which contains detailed information from aircraft accident survivors.