898 results for tool - use
Abstract:
There is increasing pressure on university staff to provide ever more information and resources to students. This study investigated student opinions on (audio) podcasts and (video) vodcasts and how well they met requirements and aided learning. Two experiments at Aston University examined student opinion on, and usage of, podcasts and vodcasts for a selection of their psychology lectures. Recordings were produced first using a hand-held camcorder, and then using the in-house media department. WebCT was used to distribute the podcasts and vodcasts, and attitude questionnaires were circulated at two time points. Overall, students indicated that podcasts and vodcasts were a beneficial additional resource for learning, particularly when used in conjunction with lecturers' slides and as a tool for revision/assessment. The online material translated into increased understanding of the material, supplementing and enhancing students' learning without substituting for traditional lectures. There is scope for the provision of portable media files to become standard practice within higher education, integrating distance and online learning with traditional approaches to improve teaching and learning.
Abstract:
Many software engineers find it difficult to understand, incorporate and use different formal models consistently in the software development process, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and supports more effective software design in terms of understanding, sharing and reusing in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect knowledge about formal software models and other related documents using semantic technology. We first propose a methodology, with tool support, to automatically derive ontological metadata from formal software models and describe them semantically. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. Finally, a method with a prototype tool is presented to support semantic queries over software models and other artefacts. © 2014.
Abstract:
The long-accumulated experience in polyparametric cognitive modeling of different physiological processes (electrocardiogram, electroencephalogram, electroreovasogram and others), and the diagnostic methods developed on this basis, provide grounds for formulating a new methodology of system analysis in biology. The gist of the methodology consists of the parametrization of fractals of electrophysiological processes, a matrix description of the functional state of an object with a unified set of parameters, and the construction of a polyparametric cognitive geometric model using artificial intelligence algorithms. The geometric model displays parameter relationships in a manner adequate to the requirements of the system approach. The objective character of the models' elements and their high degree of formalization, which facilitates the use of mathematical methods, are advantages of these models. At the same time, the geometric images are easily interpreted in physiological and clinical terms. Polyparametric modeling is an object-oriented tool with advanced functional facilities and several distinctive features.
Abstract:
METPEX is a 3-year FP7 project which aims to develop a Pan-European tool to measure the quality of the passenger's experience of multimodal transport. Initial work has led to the development of a comprehensive set of variables relating to different passenger groups, forms of transport and journey stages. This paper addresses the main challenges in transforming these variables into usable, accessible computer-based tools allowing the real-time collection of information across multiple journey stages in different EU countries. Non-computer-based measurement instruments will be used to gather information from those who may not have, or be familiar with, mobile technology. Smartphone-based measurement instruments will also be used, hosted in two applications. The mobile applications need to be easy to use, configurable and adaptable according to the context of use. They should also be inherently interesting and rewarding for the participant, whilst allowing the collection of high-quality, valid and reliable data from all journey types and stages (from planning, through entry into and egress from different transport modes, travel on public and personal vehicles, to support of active forms of transport such as cycling and walking). During all phases of data collection and processing, the participant's privacy is respected and ensured. © 2014 Springer International Publishing.
Abstract:
The mechanisms for regulating PIKfyve complex activity are currently emerging. The PIKfyve complex, consisting of the phosphoinositide kinase PIKfyve (also known as FAB1), VAC14 and FIG4, is required for the production of phosphatidylinositol-3,5-bisphosphate (PI(3,5)P2). PIKfyve function is required for homeostasis of the endo/lysosomal system and is crucially implicated in neuronal function and integrity, as loss-of-function mutations in the PIKfyve complex lead to neurodegeneration in mouse models and human patients. Our recent work has shown that the intracellular domain of the Amyloid Precursor Protein (APP), a molecule central to the aetiology of Alzheimer's disease, binds to VAC14 and enhances PIKfyve function. Here we utilise this recent advance to create an easy-to-use tool for increasing PIKfyve activity in cells. We fused APP's intracellular domain (AICD) to the HIV TAT domain, a cell-permeable peptide allowing proteins to penetrate cells. The resultant TAT-AICD fusion protein is cell permeable and triggers an increase of PI(3,5)P2. Using the PI(3,5)P2-specific GFP-ML1Nx2 probe, we show that cell-permeable AICD alters PI(3,5)P2 dynamics. TAT-AICD also provides partial protection from pharmacological inhibition of PIKfyve. All three lines of evidence show that the APP intracellular domain activates the PIKfyve complex in cells, a finding that is important for our understanding of the mechanism of neurodegeneration in Alzheimer's disease.
Abstract:
A significant body of research investigates the acceptance of computer-based support (including devices and applications ranging from e-mail to specialized clinical systems, like PACS) among clinicians. Much of this research has focused on measuring the usability of systems using characteristics related to the clarity of interactions and ease of use. We propose that an important attribute of any clinical computer-based support tool is the intrinsic motivation of the end-user (i.e. a clinician) to use the system in practice. In this paper we present the results of a study that investigated factors motivating medical doctors (MDs) to use computer-based support. Our results demonstrate that MDs value computer-based support and find it useful and easy to use; however, uptake is hindered by perceived incompetence and by the pressure and tension associated with using technology.
Abstract:
Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of their error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of the dataset and require computationally expensive Monte Carlo-based inference. Recently, within the machine learning and spatial statistics communities, many papers have explored the potential of reduced-rank representations of the covariance matrix, often referred to as projected or fixed-rank approaches. In such methods the covariance function of the posterior process is represented by a reduced-rank approximation chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time, which avoids the need for the high-dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced which implements projected, sequential estimation and adds several novel features. In particular the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed. © 2010 Elsevier Ltd.
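The sequential projected-process idea can be sketched in a few lines: represent the latent function with a small fixed basis (the reduced-rank approximation) and update the Gaussian posterior over the basis weights one observation at a time. This is an illustrative reconstruction, not the gptk library's API; the RBF basis, the noise level and all names are assumptions.

```python
import numpy as np

# Reduced-rank ("projected") sketch: the latent function lives in the span of
# m fixed basis functions, and the Gaussian posterior over their weights is
# assimilated one observation at a time, as in the sequential scheme the
# paper describes. Basis choice and hyperparameters are illustrative.

rng = np.random.default_rng(0)
m = 15                                   # rank of the representation
centres = np.linspace(0.0, 1.0, m)       # basis-function centres

def phi(x, ell=0.1):
    """RBF features projecting an input onto the m basis functions."""
    return np.exp(-0.5 * ((x - centres) / ell) ** 2)

# Prior over basis weights: w ~ N(0, I); Gaussian observation noise s2.
s2 = 0.01
A = np.eye(m)          # posterior precision, initialised at prior precision
b = np.zeros(m)        # precision-weighted mean

# Sequential assimilation: each observation is a rank-one update, so no
# high-dimensional integral over the whole dataset is ever required.
xs = rng.uniform(0.0, 1.0, 200)
ys = np.sin(2 * np.pi * xs) + rng.normal(0.0, np.sqrt(s2), xs.size)
for x, y in zip(xs, ys):
    f = phi(x)
    A += np.outer(f, f) / s2
    b += y * f / s2

w_mean = np.linalg.solve(A, b)           # posterior mean of the weights

def predict(x):
    """Posterior predictive mean at a test input."""
    return phi(x) @ w_mean

print(round(float(predict(0.25)), 2))    # should be near sin(pi/2) = 1
```

A full treatment would also propagate the posterior covariance for predictive uncertainty; the mean update alone already shows why the one-at-a-time scheme scales.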
Abstract:
The article investigates the division between member states of the European Union with respect to their level of information and communication technology (ICT) development, focusing on e-learning. With the help of discriminant analysis, the countries are categorized into groups based on their ICT maturity and the development level of their e-learning literacy. A comparison with a benchmarking tool, the ITU (International Telecommunication Union) ICT Development Index (IDI), partly confirms the results. The article seeks economic explanations for the re-grouping in the countries' rankings. Finally, the author examines the reliability of Hungary's ranking and the factors which may cause it to diverge from the real picture.
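The grouping step described above can be illustrated with a minimal two-group Fisher linear discriminant. The indicator values below are synthetic placeholders, not the article's country data or the IDI; the two columns stand in for arbitrary ICT and e-learning indicators.

```python
import numpy as np

# Minimal two-group Fisher linear discriminant, sketching how countries
# could be separated by ICT/e-learning indicators. All numbers are
# synthetic placeholders, not the paper's data.

# Rows: countries; columns: e.g. broadband penetration, e-learning uptake.
group_a = np.array([[0.82, 0.75], [0.78, 0.70], [0.85, 0.80]])  # "mature"
group_b = np.array([[0.40, 0.35], [0.45, 0.30], [0.38, 0.42]])  # "developing"

mu_a, mu_b = group_a.mean(axis=0), group_b.mean(axis=0)

# Pooled within-group scatter matrix.
Sw = np.cov(group_a.T) * (len(group_a) - 1) \
   + np.cov(group_b.T) * (len(group_b) - 1)

w = np.linalg.solve(Sw, mu_a - mu_b)       # discriminant direction
threshold = w @ (mu_a + mu_b) / 2          # midpoint of the projected means

def classify(x):
    """Assign a country's indicator vector to group 'A' or 'B'."""
    return "A" if w @ x > threshold else "B"

print(classify(np.array([0.80, 0.72])))    # lands with the mature group
```

A real analysis would use more indicators and validate group membership (the article cross-checks against the IDI ranking); the projection-plus-threshold mechanics are the same.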
Abstract:
We have witnessed a rising interest, in both the academic and the managerial fields, in the marketing application of Web 2.0 techniques, yet its effective impact and the change it brings remain unclarified. The importance of Web 2.0 is constantly on the rise: consumers are joining social networks and using social tools in ever greater numbers, giving companies a new and effective tool for marketing communication and other marketing-related activities. In our research, we first aim to clarify the definition and boundaries of Web 2.0. Then, through a literature review, we collect some of the most important areas of marketing affected by this seemingly technological change. We also give a brief overview of the challenges and risks firms face in this new environment.
Abstract:
The Ellison Executive Mentoring Inclusive Community Building (ICB) Model is a paradigm for initiating and implementing projects utilizing executives and professionals from a variety of fields and industries, university students, and pre-college students. The model emphasizes adherence to ethical values and promotes inclusiveness in community development. It is a hierarchical model in which actors at each succeeding level of operation serve as mentors to the next. Through a three-step process (content, process, and product), participants are trained, within this mentoring and apprenticeship paradigm, in conflict resolution, and they receive sensitivity and diversity training through an interactive and dramatic exposition.

The content phase introduces participants to the model's philosophy, ethics, values and methods of operation. The process used to teach and reinforce its precepts is the set of mentoring and apprenticeship activities and projects in which the participants engage, and whose end product demonstrates their knowledge and understanding of the model's concepts. This study sought to ascertain, from the participants' perspectives, whether the model's mentoring approach is an effective means of fostering inclusiveness, based upon their own experiences in using it. The research utilized a qualitative approach and included data from field observations, individual and group interviews, and written accounts of participants' attitudes.

Participants complete ICB projects utilizing The Ellison Model as a method of development and implementation. They generally perceive that the model is a viable tool for dealing with diversity issues, whether at work, at school, or at home. The projects are also instructional: whether participants are mentored or serve as apprentices, they gain useful skills and knowledge about their careers. Since the model is relatively new, there is ample room for research in a variety of areas, including organizational studies, to determine its effectiveness in combating problems related to various kinds of discrimination.
Abstract:
A comprehensive investigation of sensitive ecosystems in South Florida is reported, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments. This study presents the development and validation of a method for the fractionation and isolation from surface water of twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including 2,4-D, MCPA, dichlorprop, mecoprop, and picloram. Solid-phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization (API) with electrospray ionization in negative mode (ESP-) in a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest.

The application of Laser Ablation-ICP-MS methodology to the analysis of soils and sediments is also reported. The analytical performance of the method was evaluated on certified standards and real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using this powerful technique as a routine, rapid method for monitoring potentially contaminated sites. Forty-eight sediments were also collected from semi-pristine areas in South Florida to screen baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of elemental composition as a tool for environmental forensics.

An LA-ICP-MS protocol was also developed and optimized for the elemental analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions. These distributions were used to assess, qualitatively and quantitatively, differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.
Abstract:
The internet has been heralded as the communications and marketing tool of the future for the hospitality industry. Both corporate executives and information technology experts feel the hotel of the future cannot do without a presence on the Web. Yet do the actions of hospitality operators in the field reflect this optimism? This article reports on a study done among property managers in the U.S. lodging industry to determine the actual use of the internet in hotel properties of various types and sizes. Additionally, it addresses development and maintenance issues related to internet use.
Abstract:
Aim: To determine cut-off points for the Homeostatic Model Assessment indices 1 and 2 (HOMA-1 and HOMA-2) for identifying insulin resistance and metabolic syndrome in a Cuban-American population. Study Design: Cross-sectional. Place and Duration of Study: Florida International University, Robert Stempel School of Public Health and Social Work, Department of Dietetics and Nutrition, Miami, FL, from July 2010 to December 2011. Methodology: Subjects without diabetes residing in South Florida were enrolled (N=146, aged 37 to 83 years). The 90th percentile of HOMA1-IR and HOMA2-IR in the healthy group (n=75) was used as the cut-off point for insulin resistance. A ROC curve was constructed to determine the cut-off point for metabolic syndrome. Results: HOMA1-IR was associated with BMI, central obesity, and triglycerides (P < .05). The cut-off points for insulin resistance were >3.95 (HOMA1-IR) and >2.20 (HOMA2-IR), and for metabolic syndrome were >2.98 (63.4% sensitivity and 73.3% specificity) and >1.55 (60.6% sensitivity and 66.7% specificity), respectively. Conclusion: HOMA cut-off points may be used as a screening tool to identify insulin resistance and metabolic syndrome among Cuban-Americans living in South Florida.
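The screening use suggested in the conclusion can be sketched with the standard closed-form HOMA1 formula and the study's reported cut-offs. HOMA2 requires the dedicated HOMA Calculator software, so only HOMA1 is shown; the example subject values are hypothetical.

```python
# HOMA1-IR from fasting glucose (mmol/L) and fasting insulin (uU/mL),
# using the classic closed-form index. Cut-offs are those reported in the
# study; the example inputs are hypothetical, not study subjects.

HOMA1_IR_INSULIN_RESISTANCE = 3.95   # 90th percentile of the healthy group
HOMA1_IR_METABOLIC_SYNDROME = 2.98   # ROC-derived cut-off

def homa1_ir(glucose_mmol_l, insulin_uu_ml):
    """Classic HOMA1 insulin-resistance index."""
    return glucose_mmol_l * insulin_uu_ml / 22.5

def screen(glucose_mmol_l, insulin_uu_ml):
    """Flag a subject against the study's two HOMA1-IR cut-offs."""
    score = homa1_ir(glucose_mmol_l, insulin_uu_ml)
    return {
        "homa1_ir": round(score, 2),
        "insulin_resistant": score > HOMA1_IR_INSULIN_RESISTANCE,
        "metabolic_syndrome_risk": score > HOMA1_IR_METABOLIC_SYNDROME,
    }

print(screen(5.5, 18.0))   # HOMA1-IR = 4.4, above both cut-offs
```

As the abstract notes, such cut-offs are population-specific; applying them outside the studied population would need re-validation.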
Abstract:
OBJECTIVE: To evaluate the validity of hemoglobin A1C (A1C) as a diagnostic tool for type 2 diabetes and to determine the most appropriate A1C cutoff point for diagnosis in a sample of Haitian-Americans. SUBJECTS AND METHODS: Subjects (n = 128) were recruited from Miami-Dade and Broward counties, FL. Receiver operating characteristic (ROC) analysis was run in order to measure the sensitivity and specificity of A1C for detecting diabetes at different cutoff points. RESULTS: The area under the ROC curve was 0.86, using fasting plasma glucose ≥ 7.0 mmol/L as the gold standard. An A1C cutoff point of 6.26% had sensitivity of 80% and specificity of 74%, whereas an A1C cutoff point of 6.50% (recommended by the American Diabetes Association – ADA) had sensitivity of 73% and specificity of 89%. CONCLUSIONS: A1C is a reliable alternative to fasting plasma glucose in detecting diabetes in this sample of Haitian-Americans. A cutoff point of 6.26% was the optimum value to detect type 2 diabetes.
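The cutoff evaluation can be illustrated by computing the sensitivity and specificity of an A1C threshold against the fasting-plasma-glucose gold standard (≥ 7.0 mmol/L). The subject values below are made up for illustration and are not the study's data, so the resulting numbers differ from the reported 80%/74%.

```python
# Sensitivity and specificity of an A1C cut-off against the fasting-glucose
# gold standard (FPG >= 7.0 mmol/L), mirroring the ROC analysis described
# above. The (a1c, fpg) pairs are illustrative, not study data.

subjects = [  # (A1C %, fasting plasma glucose mmol/L)
    (5.6, 5.2), (6.1, 6.4), (6.3, 7.4), (6.8, 8.1),
    (7.2, 9.0), (6.0, 7.1), (5.9, 5.8), (6.4, 6.6),
]

def sens_spec(cutoff):
    """Sensitivity and specificity of the A1C test at a given cut-off."""
    tp = fn = tn = fp = 0
    for a1c, fpg in subjects:
        diabetic = fpg >= 7.0            # gold-standard status
        positive = a1c >= cutoff         # A1C-based test result
        if diabetic and positive:
            tp += 1
        elif diabetic:
            fn += 1
        elif positive:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(6.26)
print(f"cut-off 6.26%: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Sweeping the cutoff over all observed A1C values and plotting sensitivity against 1 − specificity yields the ROC curve whose area the study reports as 0.86.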
Abstract:
The purpose of this research was to apply model checking, using a symbolic model checker, to Predicate Transition Nets (PrT nets). A PrT net is a formal model of information flow which allows system properties to be modeled and analyzed. The aim of this thesis was to use the modeling and analysis power of PrT nets to provide a mechanism for verifying the system model. Symbolic Model Verifier (SMV) was the model checker chosen for this thesis, and in order to verify the PrT net model of a system, the net was translated into the SMV input language. A software tool was implemented which translates a PrT net into the SMV language, enabling the model-checking process. The system includes two parts: the PrT net editor, where the representation of a system can be edited, and the translator, which converts the PrT net into an SMV program.
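The flavour of such a translator can be sketched for a toy 1-safe place/transition net, rendered as an SMV model with one boolean variable per place. Real PrT nets carry predicates and structured tokens, and the thesis's tool is certainly richer; the net and the encoding below are illustrative assumptions, not the thesis's actual translation scheme.

```python
# Toy translator: a 1-safe place/transition net becomes an SMV model with a
# boolean variable per place. Firing is encoded deterministically whenever a
# transition is enabled, which is a simplification of true net semantics.

net = {
    "places": ["p1", "p2"],
    "initial": {"p1"},
    # transition name -> (consumed places, produced places)
    "transitions": {"t1": ({"p1"}, {"p2"})},
}

def to_smv(net):
    """Emit SMV text for a 1-safe net: VAR, init(...) and next(...) blocks."""
    lines = ["MODULE main", "VAR"]
    for p in net["places"]:
        lines.append(f"  {p} : boolean;")
    lines.append("ASSIGN")
    for p in net["places"]:
        marked = "TRUE" if p in net["initial"] else "FALSE"
        lines.append(f"  init({p}) := {marked};")
    for p in net["places"]:
        clauses = []
        for name, (pre, post) in net["transitions"].items():
            enabled = " & ".join(sorted(pre)) or "TRUE"
            if p in pre and p not in post:
                clauses.append(f"{enabled} : FALSE;")   # token consumed
            elif p in post:
                clauses.append(f"{enabled} : TRUE;")    # token produced
        body = "\n    ".join(clauses + [f"TRUE : {p};"])  # else: unchanged
        lines.append(f"  next({p}) :=\n   case\n    {body}\n   esac;")
    return "\n".join(lines)

print(to_smv(net))
```

With the generated text written to a file, a checker such as SMV/NuSMV can then verify temporal properties (e.g. that `p2` is eventually marked) against the net model, which is the workflow the thesis automates.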