955 results for Formal methods


Relevance:

30.00%

Publisher:

Abstract:

Introduction - The design of the UK MPharm curriculum is driven by the Royal Pharmaceutical Society of Great Britain (RPSGB) accreditation process and the EU directive (85/432/EEC).[1] Although the RPSGB is informed about teaching activity in UK Schools of Pharmacy (SOPs), there is no database aggregating this information to provide a whole picture of pharmacy education within the UK. The aim of the teaching, learning and assessment study [2] was to document and map current programmes in the 16 established SOPs. Recent developments in programme delivery have focused on deep learning (for example, through problem-based learning approaches) and on being more student-centred and less didactic through lectures. The specific objectives of this part of the study were (a) to quantify the content and modes of delivery of material as described in course documentation and (b) having categorised the range of teaching methods, to ask students to rate how important they perceived each one for their own learning (using a three-point Likert scale: very important, fairly important or not important).

Materials and methods - The study design compared three datasets: (1) a quantitative course document review, (2) qualitative staff interviews and (3) a quantitative student self-completion survey. All 16 SOPs provided a set of their undergraduate course documentation for the year 2003/4. The documentation variables were entered into Excel tables. A self-completion questionnaire was administered to all year-four undergraduates (n = 1847) in 15 SOPs within Great Britain, using a pragmatic mixture of methods. The survey data were analysed (n = 741) using SPSS, excluding non-UK students who may have undertaken part of their studies at a non-UK university.

Results and discussion - Interviews showed that individual teachers and course module leaders determine the choice of teaching methods used. Content review of the documentary evidence showed that 51% of the taught element of the course was delivered using lectures, 31% using practicals (including computer-aided learning) and 18% using small-group or interactive teaching. There was high uniformity across the schools for the first three years; variation in the final year was due to the project. The average number of hours per year across 15 schools (data for one school were not available) was: year 1, 408 hours; year 2, 401 hours; year 3, 387 hours; year 4, 401 hours. The survey showed that students perceived lectures to be the most important teaching method after dispensing or clinical practicals. Taking the very important rating only: 94% (n = 694) dispensing or clinical practicals; 75% (n = 558) lectures; 52% (n = 386) workshops; 50% (n = 369) tutorials; 43% (n = 318) directed study. Scientific laboratory practicals were rated very important by only 31% (n = 227). The study shows that undergraduate pharmacy teaching in the UK is still essentially didactic, relying on a high proportion of formal lectures and high levels of staff-student contact. Schools still consider lectures the most cost-effective means of delivering the core syllabus to large cohorts of students. However, this limits the scope for optionality within teaching; the scope for small-group work is reduced, as is the opportunity to develop multi-professional learning or practice placements. Although novel teaching and learning techniques such as e-learning have expanded considerably over the past decade, schools of pharmacy have concentrated on lectures as the best way of coping with the huge expansion in student numbers.

References
[1] Council Directive concerning the coordination of provisions laid down by law, regulation or administrative action in respect of certain activities in the field of pharmacy. Official Journal of the European Communities 1985;85/432/EEC.
[2] Wilson K, Jesson J, Langley C, Clarke L, Hatfield K. MPharm Programmes: Where are we now? Report commissioned by the Pharmacy Practice Research Trust, 2005.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Diabetes is a leading cause of visual impairment in the working-age population in the UK. This study looked at the causes of severe visual impairment (SVI) in patients attending a diabetic eye clinic, and at the influence on the rate of SVI, over a 12-year period, of introducing retinal screening programmes in the hospital and the community in 1993 (reviews in 1992, 1998 and 2004). Methods: Medical records of all patients attending the diabetic eye clinic over a period of 5 months (April to August) in 1992, 1998 and 2004 were reviewed. The data collected for each patient included age, sex, ethnic origin, diabetes (type, duration and treatment), best corrected visual acuity (current and at the time of presentation), type and duration of retinopathy, and attendance record at both the diabetic clinic and the diabetic eye clinic. In this study, SVI is defined as a visual acuity of 6/36 or worse in at least one eye. Results: In 1992, of a total of 245 patients, 58 (23.6%) had SVI: 38 (15.5% of total) due to diabetic retinopathy [31 (12.6%) maculopathy, 2 (0.8%) vitreous haemorrhage and 5 (2%) retinal detachment] and 20 (8.1%) due to causes other than diabetic retinopathy. In 1998, of a total of 297 patients, 77 (25.9%) had SVI: 33 (11.1% of total) due to diabetic retinopathy [19 (6.4%) maculopathy, 9 (3%) proliferative retinopathy, 8 (2.7%) vitreous haemorrhage and 3 (1%) retinal detachment] and 44 (14.8%) due to other causes. In 2004, of a total of 471 patients, 72 (15.2%) had SVI: 46 (9.7% of total) due to diabetic retinopathy [37 (7.8%) maculopathy, 1 (0.2%) proliferative retinopathy, 6 (1.8%) vitreous haemorrhage and 2 (0.4%) retinal detachment] and 26 (5.5%) due to other causes. Conclusions: The introduction of a formalised annual diabetic review including retinal screening, together with a community retinal screening programme, has reduced the rate of severe visual impairment due to diabetic retinopathy in patients attending the diabetic eye clinic from 15.5% in 1992 to 9.7% in 2004.
Keywords: diabetic retinopathy

Relevance:

30.00%

Publisher:

Abstract:

Vladimir Dimitrov - The aim of this report is a formal specification of the relational data model. This specification can then be extended to the object-relational data model and to data streams.
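The report itself is not reproduced in this listing, but the object being specified, a relation as a set of tuples closed under algebraic operators, can be sketched informally. The following toy Python sketch is an illustration only, not Dimitrov's formal specification; the relation and all names are invented here:

```python
# Illustrative sketch of the relational model: a relation is a set of tuples
# (attribute -> value mappings), manipulated by selection and projection.
# This is a toy under stated assumptions, not the report's specification.

def select(relation, predicate):
    """Selection (sigma): keep tuples satisfying the predicate."""
    return {t for t in relation if predicate(dict(t))}

def project(relation, attributes):
    """Projection (pi): restrict every tuple to the given attributes."""
    return {tuple(sorted((a, dict(t)[a]) for a in attributes))
            for t in relation}

# A relation is a set of hashable (attribute, value) pair tuples.
employees = {
    (("dept", "R&D"), ("name", "Ana")),
    (("dept", "Sales"), ("name", "Boris")),
}

rd_only = select(employees, lambda t: t["dept"] == "R&D")
names = project(employees, ["name"])
```

A full formal specification would additionally fix attribute domains, keys and integrity constraints; the sketch covers only two operators.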

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research was to compare the delivery methods practiced by higher education faculty teaching distance courses with recommended or emerging standard instructional delivery methods for distance education. Previous research shows that traditional instructional strategies have been used in distance education and that there has been no training in distance teaching. Secondary data, however, suggest emerging practices which could be pooled toward the development of standards. This is a qualitative study based on the constant comparative analysis approach of grounded theory. Participants (N = 5) were full-time faculty teaching distance education courses. The observation method used was unobtrusive content analysis of videotaped instruction. Triangulation of data was accomplished through one-on-one in-depth interviews and through the literature review. Because non-media content was also being analyzed, a special time-sampling technique was designed by the researcher, influenced by content-analyst theories of media-related data, to sample the portions of the videotaped instruction that were observed and counted. A standardized interview guide was used to collect data from the in-depth interviews. Coding was based on categories drawn from the literature review and from Cranton and Weston's (1989) typology of instructional strategies. The data were observed, counted, tabulated, analyzed, and interpreted solely by the researcher; systematic and rigorous data collection and analysis, however, led to credible data.

The findings supported the proposition that there are no standard instructional practices for distance teaching. Further, of the emerging practices suggested by proponents and by faculty who teach distance education courses, few were practiced even minimally. A noted example was the use of lecture and questioning. Questioning, as a teaching tool, was used a great deal with students at the originating site but not with distance students. Lectures were given, but were mostly conducted in traditional fashion: long in duration and with no interactive component. It can be concluded from the findings that while there are no standard practices for instructional delivery in distance education, there appears to be sufficient information from secondary and empirical data to initiate some. Grounded in this research data, therefore, is the theory that the way to arrive at instructional delivery standards for televised distance education is to pool the tacitly agreed-upon emerging practices of proponents and practicing instructors. Implicit in this theory is a need for experimental research so that these emerging practices can be tested, tried, and proven, ultimately resulting in formal standards for instructional delivery in television education.

Relevance:

30.00%

Publisher:

Abstract:

Modern software systems are often large and complicated. To better understand, develop, and manage large software systems, researchers have for the last decade studied software architectures, which provide the top-level overall structural design of software systems. One major research focus in software architectures is formal architecture description languages, but most existing research concentrates on descriptive capability and puts less emphasis on software architecture design methods and formal analysis techniques, which are necessary to develop correct software architecture designs. Refinement is a general approach of adding detail to a software design; a formal refinement method can further ensure certain design properties. This dissertation proposes refinement methods, including a set of formal refinement patterns and complementary verification techniques, for software architecture design using the Software Architecture Model (SAM), which was developed at Florida International University. First, a general guideline for software architecture design in SAM is proposed. Second, specification construction through property-preserving refinement patterns is discussed. The refinement patterns are categorized into connector refinement, component refinement and high-level Petri net refinement; these three levels apply to overall system interaction, architectural components, and the underlying formal language, respectively. Third, verification after modeling, as a complementary technique to specification refinement, is discussed. Two formal verification tools, the Stanford Temporal Prover (STeP) and the Simple Promela Interpreter (SPIN), are adopted into SAM to verify the initial models. Fourth, the formalization and refinement of security issues are studied: a method for security enforcement in SAM is proposed, the Role-Based Access Control model is formalized using predicate transition nets and Z notation, and patterns for enforcing access control and auditing are proposed. Finally, the modeling and refinement of a life insurance system demonstrates how to apply the refinement patterns for software architecture design using SAM and how to integrate the access control model. The results of this dissertation demonstrate that a refinement method is an effective way to develop a high-assurance system. The method developed here extends existing work on modeling software architectures using SAM and makes SAM a more usable and valuable formal tool for software architecture design.
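The dissertation formalizes Role-Based Access Control with predicate transition nets and Z notation; as a loose illustration of the access decision such a model enforces, here is a minimal Python sketch. The users, roles and operations are invented for this example and do not come from the dissertation:

```python
# Minimal Role-Based Access Control check: users are assigned roles, and
# roles grant operations. Purely illustrative; the dissertation's actual
# formalization uses predicate transition nets and Z notation.

user_roles = {"alice": {"underwriter"}, "bob": {"auditor"}}
role_perms = {"underwriter": {"create_policy"}, "auditor": {"read_audit_log"}}

def permitted(user, operation):
    """True iff some role assigned to the user grants the operation."""
    return any(operation in role_perms.get(role, set())
               for role in user_roles.get(user, set()))
```

In a refinement setting, the property to preserve would be that every executed operation passes such a check; auditing then records each decision.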

Relevance:

30.00%

Publisher:

Abstract:

Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Because of its impact on the final implementation, one critical aspect of software is its architectural design: by guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received much attention in recent years, with several methods, techniques and tools developed; however, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at Florida International University (FIU). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin: a SAM model is formally translated to a model in the input language of Spin and verified for correctness with respect to temporal properties. For testing, an approach for SAM architectures was defined which includes the evaluation of test cases, based on Petri net testing theory, to be used in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (the SAM tool) was implemented to support the design and analysis of SAM models. The results show the applicability of the approach to the testing and verification of SAM models with the aid of the SAM tool.
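The verification step described above translates a SAM model into Spin's input language and model checks it against temporal properties. The core of explicit-state model checking, exhaustively exploring reachable states and checking a property in each, can be sketched as a toy breadth-first search. This is an illustrative sketch of the general technique only, not the SAM-to-Promela translation:

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explore all reachable states breadth-first; return a state violating
    the invariant (a counterexample) or None if the invariant always holds.
    successors(state) yields next states; invariant(state) is the property."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state  # counterexample state found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None  # invariant holds in every reachable state

# Toy system: a counter wrapping at 4; the invariant "counter != 3" fails.
bad = check_invariant(0, lambda s: [(s + 1) % 4], lambda s: s != 3)
```

Real model checkers such as Spin add temporal-logic properties beyond invariants, state compression, and counterexample traces rather than single states.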

Relevance:

30.00%

Publisher:

Abstract:

Formal, systematic training has always been cited as a major need for the future success of hospitality operations. One further aspect of meeting that need might be the development of a train-the-trainer curriculum for hospitality management students. The author studies the relationship between the training preparation and the training methods utilized by restaurant managers and explores this need.

Relevance:

30.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified against the partial-order models using model checking. Our formal specification and verification of Mondex contributes to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it exploits the nature of atomicity violations, considering only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to balance precision against coverage. Building on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity-violation predictions: (1) a post-prediction analysis method that increases coverage while ensuring precision, and (2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
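The abstract does not detail McPatom's algorithm, but the class of bugs it predicts can be illustrated with a toy checker for one classic unserializable pattern on a single shared variable: a read and a later write by one thread split by another thread's write. This is an invented sketch of the bug pattern, not the tool's actual analysis:

```python
# Toy detector for a read-write atomicity violation on one shared variable.
# A trace is a list of (thread, op) events, op in {"R", "W"}. If a thread
# reads, and its next access is a write, any remote write in between makes
# the intended read-modify-write block unserializable. Illustrative only.

def has_rwr_violation(trace):
    """Return True if some thread's read...write pair is split by a
    write from a different thread."""
    for i, (thread, op) in enumerate(trace):
        if op != "R":
            continue
        # Find this thread's next access after the read.
        for j in range(i + 1, len(trace)):
            if trace[j][0] == thread:
                if trace[j][1] == "W" and any(
                        t != thread and o == "W" for t, o in trace[i + 1:j]):
                    return True  # remote write split the read-write block
                break
    return False
```

Restricting attention to one thread pair and one shared variable, as the abstract notes, is what keeps this style of analysis scalable.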

Relevance:

30.00%

Publisher:

Abstract:

One of the biggest environmental problems facing the population is the lack of sewage treatment, especially in rural and low-income communities. Efficient, low-cost sanitation technologies need to be developed to bring this basic service to disadvantaged people. This work proposed the implementation of a technology based on constructed wetlands, also known as a root-zone wastewater treatment plant (ETEZR). The objective was to develop a non-formal environmental education proposal, using outreach methods for residents, for the deployment of this ETEZR technology in the rural community of Colônia Grebe in São José dos Pinhais, Paraná. With technical support from the Paraná Technical Assistance and Rural Extension Institute (EMATER) and the Federal Technological University of Paraná (UTFPR), 5 ETEZR were deployed in the community through three theoretical and practical workshops, which involved in total 67 people from the community, 5 EMATER technicians and 13 staff from the municipal government. Four months after implementation, two collections of raw and treated wastewater were carried out to analyze physical, chemical and biological parameters. The results for the chemical parameters BOD, COD, phosphorus and ammonia nitrogen, comparing raw and treated sewage, demonstrate that the ETEZR are effective in treating sewage. Across the 5 stations, minimum and maximum removal efficiencies for the parameters analyzed were 52.2 to 95.5% for BOD; 47 to 94.5% for COD; 21.5 to 96% for phosphorus; and 30 to 98% for ammonia nitrogen. Oils and greases and the solids series also showed significant reductions between raw and treated sewage, and the biological parameters, evaluated by means of coliforms, showed a reduction of 80 to 99%.

Through this sanitation-oriented environmental education process it was possible to evaluate the population's willingness to accept the ETEZR sanitation technology and to understand the community's needs and sanitation concepts. The research evaluated the non-formal environmental education methodology applied, in order to provide inputs for the municipality's rural sanitation planning process.
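The removal efficiencies quoted above follow the usual percent-reduction formula, efficiency = (raw - treated) / raw x 100. A short sketch with hypothetical concentrations, not the study's measurements:

```python
def removal_efficiency(raw, treated):
    """Percent reduction of a pollutant concentration after treatment."""
    return (raw - treated) / raw * 100.0

# Hypothetical BOD concentrations (mg/L) at one station, invented values.
eff = removal_efficiency(300.0, 30.0)  # 90% of the BOD load removed
```

The study's reported ranges (e.g. 52.2 to 95.5% for BOD) are the minimum and maximum of such per-station, per-parameter values.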

Relevance:

30.00%

Publisher:

Abstract:

Background: The study upon which this paper is based was undertaken to understand users' and non-users' perceptions of the facilitators of, and barriers to, equitable and universal access to health care in resource-poor countries such as Malawi. In this study, non-users of health services were defined as people who were not in need of health services or who had stopped using them due to significant barriers. Methods: A total of 80 interviews with non-users of health services were conducted in the Rumphi, Ntchisi, Phalombe and Blantyre Districts of Malawi. Interviews focused on why informants were not using formal health services at the time of data collection. Non-users were identified through snowballing; health surveillance assistants, village headmen and community members also helped. One focus group discussion was also conducted with non-users of health services who were members of the Zion Church. Results: Informants described themselves as non-users of health services for several reasons: the cost of health services; long distances to health facilities; the poor attitude of health workers; belief in the effectiveness of traditional medicines; and old age with an inability to walk. Others were non-users due to disability, which meant they could not walk long distances or could not communicate effectively with health providers. Some were complete non-users, namely members of the Zion Church and those who believed in traditional medicine, who stated that nothing could be done to transform them into users of health services. Other non-users stated that they could become users if their challenges were addressed; for example, those who avoided health services because of the poor attitudes of health workers stated that if these workers were transferred they would be able to access health services.

Conclusions: Public health education targeting both health workers and non-users, ensuring a functional outreach programme, and addressing other health system challenges such as shortages of drugs and human resources would assist in transforming non-users into users of health services.

Relevance:

30.00%

Publisher:

Abstract:

Research in human-computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI, the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, focus groups, surveys, usability tests, case studies, diary studies, ethnography, contextual inquiry, experience sampling, and automated data collection. In this paper, we report on our experience using focus groups, surveys and interviews as evaluation methods, and on how we adapted these methods to develop artefacts: either interface designs or information and technological systems. Four projects exemplify the application of the different methods to gather information about users' wants, habits, practices, concerns and preferences. The goal was to build an understanding of the attitudes and satisfaction of the people who might interact with a technological artefact or information system. In turn, we intended to design information systems and technological applications that promote resilience in organisations (a set of routines that allows recovery from obstacles) and positive user experiences. Organisations can here also be viewed within a systems approach, which means that system perturbations, even failures, can be characterized and improved upon. The term resilience has been applied to everything from real estate to the economy, sports, events, business, psychology, and more.

In this study, we highlight that resilience is also made up of a number of different skills and abilities (self-awareness, creating meaning from other experiences, self-efficacy, optimism, and building strong relationships), a few foundational ingredients which people should draw on in the process of enhancing an organisation's resilience. Resilience enhances knowledge of the resources available to people confronting existing problems.