Abstract:
BACKGROUND: The ability to write clearly and effectively is of central importance to the scientific enterprise. Encouraged by the success of simulation environments in other biomedical sciences, we developed WriteSim TCExam, an open-source, Web-based, textual simulation environment for teaching effective writing techniques to novice researchers. We shortlisted and modified an existing open-source application, TCExam, to serve as a textual simulation environment. After testing usability internally in our team, we conducted formal field usability studies with novice researchers. These were followed by formal surveys with researchers fitting the roles of administrators and users (novice researchers). RESULTS: The development process was guided by feedback from usability tests within our research team. Online surveys and formal studies, involving members of the Research on Research group and selected novice researchers, show that the application is user-friendly. Additionally, it has been used to train 25 novice researchers in scientific writing to date and has generated encouraging results. CONCLUSION: WriteSim TCExam is the first Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews by students and educators have been positive, a formal study is needed to measure its benefits in comparison to standard instructional methods.
Abstract:
Gemstone Team CHIP
Abstract:
The Computer Aided Parallelisation Tools (CAPTools) [Ierotheou C, Johnson SP, Cross M, Leggett PF. Computer aided parallelisation tools (CAPTools) - conceptual overview and performance on the parallelisation of structured mesh codes. Parallel Computing 1996;22:163-195] are a set of interactive tools aimed at providing automatic parallelisation of serial FORTRAN Computational Mechanics (CM) programs. CAPTools analyses the user's serial code and then, through stages of array partitioning, mask and communication calculation, generates parallel SPMD (Single Program Multiple Data) message-passing FORTRAN. The parallel code generated by CAPTools contains calls to a collection of routines that form the CAPTools Communications Library (CAPLib). The library provides a portable layer and a user-friendly abstraction over the underlying parallel environment. CAPLib contains optimised message-passing routines for data exchange between parallel processes, and other utility routines for parallel execution control, initialisation and debugging. By compiling and linking with different implementations of the library, the user is able to run on many different parallel environments. Even with today's parallel systems, the concept of a single version of a parallel application code is more of an aspiration than a reality. However, for CM codes the data-partitioning SPMD paradigm requires a relatively small set of message-passing communication calls. This set can be implemented as an intermediate 'thin layer' library of message-passing calls that enables the parallel code (especially that generated automatically by a parallelisation tool such as CAPTools) to be as generic as possible. CAPLib is just such a 'thin layer' message-passing library that supports parallel CM codes by mapping generic calls onto machine-specific libraries (such as CRAY SHMEM) and portable general-purpose libraries (such as PVM and MPI).
This paper describes CAPLib together with its three perceived advantages over other routes: as a high-level abstraction, it is both easy to understand (especially when generated automatically by tools) and easy to implement by hand for the CM community (who are not generally parallel computing specialists); the one parallel version of the application code is truly generic and portable; and the parallel application can readily utilise whatever message-passing libraries on a given machine yield optimum performance.
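To illustrate the 'thin layer' idea in present-day terms, the sketch below (Python rather than FORTRAN; `cap_init`, `cap_send`, `cap_receive` and `QueueBackend` are invented names, not CAPLib's real interface) shows how application code can call only generic communication routines while a pluggable backend maps them onto a concrete transport, as CAPLib does for MPI, PVM and CRAY SHMEM:

```python
import queue

# Sketch of a 'thin layer' message-passing abstraction (illustrative only).
# Application code calls only the generic routines; the backend maps them
# onto a concrete transport: MPI, PVM or CRAY SHMEM in CAPLib's case, an
# in-process mailbox here.

class QueueBackend:
    """Toy transport: one mailbox per logical process."""
    def __init__(self, nprocs):
        self.mailbox = {p: queue.Queue() for p in range(nprocs)}
    def send(self, dest, data):
        self.mailbox[dest].put(data)
    def receive(self, rank):
        return self.mailbox[rank].get()

_backend = None

def cap_init(nprocs, backend_cls=QueueBackend):
    """Select the transport; swapping backend_cls retargets the whole code."""
    global _backend
    _backend = backend_cls(nprocs)

def cap_send(dest, data):
    _backend.send(dest, data)

def cap_receive(rank):
    return _backend.receive(rank)

cap_init(2)
cap_send(1, [3.0, 4.0])       # rank 0 ships a halo slice to rank 1
halo = cap_receive(1)         # rank 1 picks it up
```

Retargeting the application then amounts to supplying a different backend class to `cap_init`, leaving the generated parallel code untouched.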
Abstract:
A new approach to the prediction of bend lifetime in pneumatic conveyors subject to erosive wear is described. Mathematical modelling is exploited: commercial Computational Fluid Dynamics (CFD) software is used for the prediction of air flow and particle tracks, and custom code for the modelling of bend erosion and lifetime prediction. The custom code uses a toroidal geometry and employs a range of empirical data, rather than trying to fit classical erosion models to a particular circumstance. The data used were obtained relatively quickly and easily from a gas-blast erosion tester. A full-scale pneumatic conveying rig was used to validate a sample of the bend lifetime predictions, and the results suggest accuracy of within ±65% when calibration methods are used. Finally, the work is distilled into user-friendly interactive software that will make erosion lifetime predictions for a wide range of bends under varying conveying conditions. This could be a valuable tool for the pneumatic conveyor design or maintenance engineer.
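The idea of predicting lifetime directly from empirical tester data, rather than from a classical erosion model, can be sketched as follows (illustrative only: the real model couples CFD particle tracks with a toroidal bend geometry, and every number below is invented):

```python
# Minimal sketch of lifetime prediction from empirical erosion data.
# Gas-blast tester data: impact angle (deg) -> penetration rate
# (mm per tonne of conveyed product) for the bend material.
angle_data = [(15.0, 0.002), (30.0, 0.006), (45.0, 0.009), (60.0, 0.007)]

def erosion_rate(angle):
    """Piecewise-linear interpolation over the empirical tester data."""
    pts = sorted(angle_data)
    if angle <= pts[0][0]:
        return pts[0][1]
    if angle >= pts[-1][0]:
        return pts[-1][1]
    for (a0, r0), (a1, r1) in zip(pts, pts[1:]):
        if a0 <= angle <= a1:
            return r0 + (r1 - r0) * (angle - a0) / (a1 - a0)

def bend_lifetime(wall_mm, angle, throughput_tph):
    """Hours until the predicted penetration equals the wall thickness."""
    rate = erosion_rate(angle)          # mm per tonne
    return wall_mm / (rate * throughput_tph)

life = bend_lifetime(wall_mm=4.0, angle=37.5, throughput_tph=10.0)
```

Interpolating measured rates sidesteps the fitting problem the abstract mentions: no single classical erosion law has to hold across all impact angles.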
Abstract:
The Continuous Plankton Recorder Survey has operated in the North Atlantic and North Sea since 1931, providing a unique multi-decadal dataset of plankton abundance. Over the period since 1931, technology has advanced and the system for storing the CPR data has developed considerably. From 1969 an electronic database was developed to store the results of CPR analysis. Since that time the CPR database has undergone a number of changes due to performance-related factors, such as processor speed and disk capacity, as well as economic factors, such as the cost of software. These issues have been overcome, and the system for storing and retrieving the data has become more user-friendly at every development stage.
Abstract:
The Continuous Plankton Recorder (CPR) survey provides a unique multi-decadal dataset on the abundance of plankton in the North Sea and North Atlantic and is one of only a few monitoring programmes operating at a large spatio-temporal scale. The results of all samples analysed from the survey since 1946 are stored on an Access Database at the Sir Alister Hardy Foundation for Ocean Science (SAHFOS) in Plymouth. The database is large, containing more than two million records (~80 million data points, if zero results are added) for more than 450 taxonomic entities. An open data policy is operated by SAHFOS. However, the data are not on-line and so access by scientists and others wishing to use the results is not interactive. Requests for data are dealt with by the Database Manager. To facilitate access to the data from the North Sea, which is an area of high research interest, a selected set of data for key phytoplankton and zooplankton species has been processed in a form that makes them readily available on CD for research and other applications. A set of MATLAB tools has been developed to provide an interpolated spatio-temporal description of plankton sampled by the CPR in the North Sea, as well as easy and fast access to users in the form of a browser. Using geostatistical techniques, plankton abundance values have been interpolated on a regular grid covering the North Sea. The grid is established on centres of 1 degree longitude x 0.5 degree latitude (~32 x 30 nautical miles). Based on a monthly temporal resolution over a fifty-year period (1948-1997), 600 distribution maps have been produced for 54 zooplankton species, and 480 distribution maps for 57 phytoplankton species over the shorter period 1958-1997.
The gridded database has been developed in a user-friendly form and incorporates, as a package on a CD, a set of options for visualisation and interpretation, including the facility to plot maps for selected species by month, year, groups of months or years, long-term means or as time series and contour plots. This study constitutes the first application of an easily accessed and interactive gridded database of plankton abundance in the North Sea. As a further development the MATLAB browser is being converted to a user-friendly Windows-compatible format (WinCPR) for release on CD and via the Web in 2003.
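The gridding step described above can be sketched in miniature (the survey used geostatistical techniques such as kriging; plain inverse-distance weighting stands in for them here, and the sample values are invented):

```python
import math

# Sketch of gridding scattered CPR samples onto a regular grid.
samples = [  # (lon, lat, abundance) -- invented values
    (0.2, 55.1, 120.0),
    (1.1, 55.4, 80.0),
    (0.7, 54.8, 150.0),
]

def idw(lon, lat, pts, power=2.0):
    """Inverse-distance-weighted estimate at one grid node."""
    num = den = 0.0
    for plon, plat, val in pts:
        d = math.hypot(lon - plon, lat - plat)
        if d < 1e-9:
            return val              # node coincides with a sample
        w = d ** -power
        num += w * val
        den += w
    return num / den

# Grid nodes on 1 degree longitude x 0.5 degree latitude centres.
grid = {(lon, lat): idw(lon, lat, samples)
        for lon in [0.0, 1.0]
        for lat in [54.5, 55.0, 55.5]}
```

Each grid node then carries a single interpolated abundance value, which is what makes fast map plotting and time-series extraction possible in the browser.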
Abstract:
Remote sensing airborne hyperspectral data are routinely used for applications including algorithm development for satellite sensors, environmental monitoring and atmospheric studies. Single flight lines of airborne hyperspectral data are often in the region of tens of gigabytes in size. This means that a single aircraft can collect terabytes of remotely sensed hyperspectral data during a single year. Before these data can be used for scientific analyses, they need to be radiometrically calibrated, synchronised with the aircraft's position and attitude, and then geocorrected. To enable efficient processing of these large datasets, the UK Airborne Research and Survey Facility has recently developed a software suite, the Airborne Processing Library (APL), for processing airborne hyperspectral data acquired from the Specim AISA Eagle and Hawk instruments. The APL toolbox allows users to radiometrically calibrate, geocorrect, reproject and resample airborne data. Each stage of the toolbox outputs data in the common band-interleaved-by-line (BIL) format, which allows its integration with other standard remote sensing software packages. APL was developed to be user-friendly and suitable for use on a workstation PC as well as for the automated processing of the facility; to this end, APL can be used under both Windows and Linux environments on a single desktop machine or through a Grid engine. A graphical user interface also exists. In this paper we describe the Airborne Processing Library software, its algorithms and approach. We present example results from using APL with an AISA Eagle sensor, and we assess its spatial accuracy using data from multiple flight lines collected during a campaign in 2008, together with in situ surveyed ground control points.
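The band-interleaved-by-line ordering that each APL stage emits can be illustrated with a minimal reader/writer (a sketch only: real APL output also carries header metadata, and the toy dimensions here are invented):

```python
import os
import struct
import tempfile

# Band-interleaved-by-line (BIL): for every scan line, the file stores that
# line's pixels for band 1, then band 2, and so on.

def write_bil(path, cube):
    """cube[band][line][sample] -> flat BIL file of little-endian float32."""
    bands, lines, samples = len(cube), len(cube[0]), len(cube[0][0])
    with open(path, "wb") as f:
        for line in range(lines):
            for band in range(bands):
                f.write(struct.pack(f"<{samples}f", *cube[band][line]))

def read_bil(path, bands, lines, samples):
    """Inverse of write_bil for known dimensions."""
    cube = [[[0.0] * samples for _ in range(lines)] for _ in range(bands)]
    with open(path, "rb") as f:
        for line in range(lines):
            for band in range(bands):
                cube[band][line] = list(
                    struct.unpack(f"<{samples}f", f.read(4 * samples)))
    return cube

cube = [[[1.0, 2.0], [3.0, 4.0]],      # band 1: 2 lines x 2 samples
        [[5.0, 6.0], [7.0, 8.0]]]      # band 2
path = os.path.join(tempfile.mkdtemp(), "demo.bil")
write_bil(path, cube)
roundtrip = read_bil(path, 2, 2, 2)
```

Because the layout is line-major, a processing stage can stream one scan line at a time, which is what keeps multi-gigabyte flight lines tractable.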
Abstract:
Executive Summary
The programme of work was commissioned in September 1998 to supply information to underpin the UK's commitments to protection and conservation of the ecosystems and biodiversity of the marine environment under the 1992 OSPAR Convention on the Protection of the Marine Environment of the North East Atlantic. The programme also provided support for the implementation of the Biodiversity Convention and the EU Habitats Directive. The MarLIN programme initiated a new approach to assessing sensitivity and recoverability characteristics of seabed species and biotopes based on structures (such as the seabed biotopes classification) and criteria (such as for assessing rarity and defining 'sensitivity') developed since 1997. It also developed tools to disseminate the information on the Internet. The species researched were those that were listed in conventions and directives, included in Biodiversity Action Plans, or were nationally rare or scarce. In addition, species were researched if they maintained community composition or structure and/or provided a distinctive habitat, or were special to or especially abundant in a particular situation or biotope. At its conclusion in August 2001, the work carried out under the contract with DETR/DEFRA had:
· Developed protocols, criteria and structures for identifying 'sensitivity' and 'recoverability', which were tested by a programme management group.
· Developed a database to hold research data on the biology and sensitivity of species and biotopes.
· Defined the link between human activities and the environmental factors likely to be affected by those activities.
· Developed a user-friendly Web site to access information from the database on the sensitivity and recoverability characteristics of over 100 species, and basic information on over 200 species.
Additionally, the project team have:
· Brought together and facilitated discussion between current developers and users of electronic resources for environmental management, protection and education in the conference 'Using Marine Biological Information in the Electronic Age' (19-21 July 1999).
· Contributed to the development of Ecological Quality Objectives for the North Sea (Scheveningen, 11-13 September 1999 and subsequent papers).
· Provided detailed information on species as a supplement to the National Biodiversity Network Gateway demonstration www.searchnbn.net.
· Developed a peer-reviewed approach to electronic publication of updateable information.
· Promoted the contract results and the MarLIN approach to the support of marine environmental management and protection at European research fora and, through the web site, internationally.
The information available through the Web site is now being used by consultants and Government agencies. The DEFRA contract has been of critical importance in establishing the Marine Life Information Network (MarLIN) programme and has encouraged support from other organisations. Other related work in the MarLIN programme is on-going, especially to identify the sensitivity of biotopes to support management of SACs (contract from English Nature in collaboration with Scottish Natural Heritage), to access data sources (in collaboration with the National Biodiversity Network) and to establish volunteer recording schemes for marine life. The results of the programme are best viewed on the Web site (www.marlin.ac.uk). Three reports have been produced during the project. A final report detailing the work undertaken, a brochure 'Identifying the sensitivity of seabed ecosystems' and a CD-ROM describing the programme and demonstrating the Web site have been delivered as final products in addition to the Web site.
Abstract:
A combination of linkage analyses and association studies is currently employed to promote the identification of genetic factors contributing to inherited renal disease. We have standardized and merged complex genetic data from disparate sources, creating unique chromosomal maps to enhance genetic epidemiological investigations. This database and its novel renal maps effectively summarize genomic regions of suggested linkage, association, or chromosomal abnormalities implicated in renal disease. Chromosomal regions associated with potential intermediate clinical phenotypes have been integrated, adding support for particular genomic intervals. More than 500 reports from medical databases, published scientific literature, and the World Wide Web were interrogated for relevant renal-related information. Chromosomal regions highlighted for prioritized investigation of renal complications include 3q13-26, 6q22-27, 10p11-15, 16p11-13, and 18q22. Combined genetic and physical maps are effective tools for organizing genetic data on complex diseases. These renal chromosome maps provide insights into renal phenotype-genotype relationships and act as a template for future genetic investigations into complex renal diseases. New data from individual researchers and/or future publications can be readily incorporated into this resource via a user-friendly web form accessed from the website: www.qub.ac.uk/neph-res/CORGI/index.php.
Abstract:
Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base, and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool.
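A heavily simplified sketch of querying such a knowledge base is given below (the features, probability intervals and the independence assumption used to combine them are all invented for illustration; the actual system reasons with imprecise probabilities elicited through its graphical interface):

```python
# Toy sketch of an imprecise probabilistic knowledge base: each structural
# feature carries an interval estimate of P(substrate | feature present),
# elicited from past screening experiments (values invented).
knowledge_base = {
    "hydroxyl_at_R1": (0.6, 0.8),
    "bulky_at_R2":    (0.3, 0.5),
    "ring_open":      (0.7, 0.9),
}

def predict(features):
    """Combine the intervals of the queried features, treating them as
    independent requirements: interval endpoints multiply."""
    lo = hi = 1.0
    for f in features:
        flo, fhi = knowledge_base[f]
        lo *= flo
        hi *= fhi
    return lo, hi

# Query a candidate compound showing two of the known features.
lo, hi = predict(["hydroxyl_at_R1", "ring_open"])
```

The interval output, rather than a point probability, is what makes the prediction honest about gaps in the experimental evidence.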
Abstract:
Monensin, a carboxylic acid ionophore, is commonly fed to poultry to control coccidiosis. A method for rapid analysis of unextracted poultry plasma samples has been developed based on a novel immunoassay format: one-step, all-in-one dry reagent time-resolved fluorimetry. All assay-specific components were pre-dried onto microtitration plate wells. Only addition of the serum sample diluted in assay buffer was required to perform the analysis. Results were available one hour after sample addition. The limit of detection (mean + 3s) of the assay, calculated from the analysis of 23 known negative samples, was 14.2 ng ml(-1). Intra- and inter-assay RSDs were determined as 15.2 and 7.4%, respectively, using a plasma sample fortified with 50 ng ml(-1) monensin. Eight broiler chickens were fed monensin at a dose rate of 120 mg kg(-1) feed for one week, blood sampled, then slaughtered without drug withdrawal. Plasma monensin concentrations, as determined by the fluoroimmunoassay, ranged from 101-297 ng ml(-1). This compared with monensin liver concentrations, determined by LC-MS, which ranged from 13-41 ng g(-1). The fluoroimmunoassay described is extremely user-friendly, gives particularly rapid results and is suitable for the detection and quantification of plasma monensin residues. Data from medicated poultry suggest that analysis of plasma may be useful in predicting the extent of monensin liver residues.
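The detection limit (mean + 3s of known negatives) and the RSD figures quoted above are simple summary statistics; a sketch with invented replicate values:

```python
# Summary statistics behind the assay figures (all data below invented).

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    """Sample standard deviation (n - 1 denominator)."""
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

def limit_of_detection(negatives):
    """Mean + 3 standard deviations of known negative samples."""
    return mean(negatives) + 3 * sd(negatives)

def rsd_percent(replicates):
    """Relative standard deviation, as used for assay precision."""
    return 100.0 * sd(replicates) / mean(replicates)

negatives = [2.0, 3.5, 1.8, 2.6, 3.1]           # ng/ml, invented
replicates = [48.0, 52.5, 50.1, 47.2, 53.9]     # 50 ng/ml fortified sample

lod = limit_of_detection(negatives)
precision = rsd_percent(replicates)
```

Any result above `lod` is then reported as a detected residue, with `precision` characterising the spread expected between repeat measurements.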
Abstract:
This paper describes the design, application, and evaluation of a user-friendly, flexible, scalable and inexpensive Advanced Educational Parallel (AdEPar) digital signal processing (DSP) system based on TMS320C25 digital processors to implement DSP algorithms. This system will be used in the DSP laboratory by graduate students to work on advanced topics such as developing parallel DSP algorithms. Graduating senior students who have gained some experience in DSP can also use the system. The DSP laboratory has proved to be a useful tool in the hands of the instructor to teach the mathematically oriented topics of DSP that are often difficult for students to grasp. The DSP laboratory with assigned projects has greatly improved the ability of the students to understand such complex topics as the fast Fourier transform algorithm, linear and circular convolution, and the theory and design of infinite impulse response (IIR) and finite impulse response (FIR) filters. The user-friendly PC software support of the AdEPar system makes it easy for students to develop DSP programs. This paper gives the architecture of the AdEPar DSP system. The communication between processors and PC-DSP processor communication are explained. The parallel debugger kernels and the restrictions of the system are described. Programming in AdEPar is explained, and two benchmarks (a parallel FFT and DES) are presented to show the system performance.
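One of the topics the laboratory teaches, circular convolution, can be illustrated through its DFT-domain definition (pointwise multiplication of spectra); the pure-Python DFT below is for clarity, where a real exercise on a parallel DSP system would use a fast FFT:

```python
import cmath

# Circular convolution via the DFT: multiply the spectra of the two
# sequences, then transform back (naive O(n^2) DFT for readability).

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n)
                for j in range(n)) / n
            for k in range(n)]

def circular_convolve(x, h):
    """Pointwise-multiply spectra, invert, round off numerical noise."""
    X, H = dft(x), dft(h)
    y = idft([a * b for a, b in zip(X, H)])
    return [round(v.real, 9) for v in y]

y = circular_convolve([1.0, 2.0, 3.0, 4.0], [1.0, 1.0, 0.0, 0.0])
```

The wrap-around in the result (the last input sample feeds the first output) is exactly the property that distinguishes circular from linear convolution, and the point the laboratory exercise would be making.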
Abstract:
The international introduction of electric vehicles (EVs) will see a change in private passenger car usage, operation and management. There are many stakeholders, but currently it appears that the automotive industry is focused on EV manufacture, governments and policy makers have highlighted the potential environmental and job-creation opportunities, while the electricity sector is preparing for an additional electrical load on the grid system. If the deployment of EVs is to be successful, the introduction of international EV standards, universal charging hardware infrastructure, associated universal peripherals and user-friendly software on public and private property is necessary. The focus of this paper is to establish the state of the art in EV charging infrastructure, which includes a review of existing and proposed international standards, best practice and guidelines under consideration or recommendation.
Abstract:
Background: A suite of 10 online virtual patients developed using the IVIMEDS ‘Riverside’ authoring tool has been introduced into our undergraduate general practice clerkship. These cases provide a multimedia-rich experience to students. Their interactive nature promotes the development of clinical reasoning skills such as discriminating key clinical features, integrating information from a variety of sources and forming diagnoses and management plans.
Aims: To evaluate the usefulness and usability of a set of online virtual patients in an undergraduate general practice clerkship.
Method: An online questionnaire incorporating the System Usability Scale, completed by students after their general practice placement.
Results: There was a 57% response rate. Ninety-five per cent of students agreed that the online package was a useful learning tool and ranked virtual patients third out of six learning modalities. Questions and answers and the use of images and videos were all rated highly by students as useful learning methods. The package was perceived to have a high level of usability among respondents.
Conclusion: Feedback from students suggests that this implementation of virtual patients, set in primary care, is user-friendly and rated as a valuable adjunct to their learning. The cost of producing such learning resources demands close attention to design.
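For reference, the System Usability Scale used in the method has a standard scoring rule: ten items rated 1-5, odd items contribute (rating - 1), even items contribute (5 - rating), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch (the ratings below are invented):

```python
# Standard SUS scoring: ten items on a 1-5 scale; odd items are positively
# worded, even items negatively worded, hence the two contribution rules.

def sus_score(ratings):
    """ratings: list of ten 1-5 responses, item 1 first; returns 0-100."""
    assert len(ratings) == 10
    total = 0
    for i, r in enumerate(ratings, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

best = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])   # most favourable answers
```

Averaging `sus_score` across respondents gives the kind of overall usability figure the study's questionnaire would have yielded.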