965 results for formal methods


Relevance:

30.00%

Publisher:

Abstract:

Traditionally, the real world has been shown to us through flat images. These images used to be materialised as paintings on canvas or as drawings. Today, fortunately, we can still see hand-made paintings, although most images are acquired with cameras and are either shown directly to an audience, as in cinema, television or photographic exhibitions, or processed by a computer system to obtain a particular result. Such processing is applied in fields ranging from industrial quality control to cutting-edge research in artificial intelligence. By applying mid-level processing algorithms, 3D images can be obtained from 2D images using well-known techniques called Shape From X, where X is the method used to obtain the third dimension and varies according to the technique employed for that purpose. Although the evolution towards the 3D camera began in the 1990s, the techniques for obtaining three-dimensional shapes must become ever more accurate. The applications of 3D scanners have grown considerably in recent years, especially in fields such as entertainment, computer-assisted diagnosis/surgery, robotics, etc. One of the most widely used techniques for obtaining 3D information about a scene is triangulation and, more specifically, the use of three-dimensional laser scanners. Since their formal appearance in scientific publications in 1971 [SS71], there have been contributions addressing inherent problems such as the reduction of occlusions and improvements in accuracy, acquisition speed, shape description, etc. Each and every method for obtaining 3D points of a scene has an associated calibration process, and this process plays a decisive role in the performance of a three-dimensional acquisition device. 
The purpose of this thesis is to address the problem of 3D shape acquisition from a comprehensive point of view: reporting a state of the art on triangulation-based laser scanners, testing the operation and performance of different systems, making contributions to improve the accuracy of laser-stripe detection, especially under adverse conditions, and solving the calibration problem by means of projective geometric methods.
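The active-triangulation geometry the thesis builds on can be sketched in a few lines. The 2D setup below is our own simplification for illustration, not the thesis's notation: a pinhole camera at the origin looks along +z, the laser emitter sits at a baseline offset on the x-axis, and the depth of the illuminated point follows from intersecting the camera ray with the laser ray.

```python
import math

def triangulate(u, baseline, focal_px, laser_angle):
    """Recover the (x, z) scene point lit by the laser stripe.

    The camera pinhole at the origin looks along +z and images the point at
    pixel offset u; the emitter at (baseline, 0) is tilted by laser_angle
    (radians) toward the optical axis. Intersecting the camera ray
    x = u * z / focal_px with the laser ray x = baseline - z * tan(laser_angle)
    yields the depth z.
    """
    z = baseline * focal_px / (u + focal_px * math.tan(laser_angle))
    x = u * z / focal_px
    return x, z

# A point at x = 0.05 m, z = 0.4 m seen with an 800 px focal length and a
# 0.2 m baseline projects to u = 100 px and requires tan(laser_angle) = 0.375.
x, z = triangulate(100, 0.2, 800, math.atan(0.375))
```

As the abstract notes, the accuracy of the recovered depth hinges on how precisely the stripe peak u is located and on how well the baseline, focal length and laser angle are calibrated, which is exactly where the thesis's contributions lie.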

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to present two multi-criteria decision-making models, an Analytic Hierarchy Process (AHP) model and an Analytic Network Process (ANP) model, for the assessment of deconstruction plans, and to compare the two models through an experimental case study. Deconstruction planning is under pressure to reduce operation costs, adverse environmental impacts and duration, while improving productivity and safety in accordance with structure characteristics, site conditions and past experience. To achieve these targets in deconstruction projects, there is a pressing need for a formal procedure that helps contractors select the most appropriate deconstruction plan. Because a number of factors influence the selection of deconstruction techniques, engineers need effective tools to conduct the selection process. In this regard, multi-criteria decision-making methods such as AHP have been adopted in previous research to support deconstruction technique selection, where it has been shown that the AHP method can help decision-makers make informed decisions on deconstruction technique selection based on a sound technical framework. In this paper, the authors present the application and comparison of two decision-making models, the AHP model and the ANP model, for deconstruction plan assessment. The paper concludes that both AHP and ANP are viable and capable tools for deconstruction plan assessment under the same set of evaluation criteria. However, although the ANP can measure relationships among selection criteria and their sub-criteria, which are normally ignored in the AHP, the authors also indicate that whether the ANP model provides a more accurate result should be examined in further research.
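The core of the AHP step described above, turning pairwise judgments into criterion weights and checking their consistency, can be sketched as follows. The criteria and the comparison matrix are invented for illustration and are not taken from the paper.

```python
# Hypothetical criteria for ranking deconstruction plans (illustrative only).
criteria = ["cost", "environmental impact", "duration", "safety"]

# Pairwise comparison matrix on Saaty's 1-9 scale: A[i][j] states how much
# more important criterion i is than criterion j (reciprocal by construction).
A = [
    [1,   3,   2,   1/2],
    [1/3, 1,   1/2, 1/4],
    [1/2, 2,   1,   1/3],
    [2,   4,   3,   1  ],
]

def ahp_weights(A, iters=200):
    """Priority vector: principal eigenvector of A via power iteration."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

def consistency_ratio(A, w):
    """Saaty's consistency ratio; judgments are conventionally accepted
    when CR < 0.1."""
    n = len(A)
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # random-consistency index
    return ci / ri

weights = ahp_weights(A)
```

With this (consistent) matrix, "safety" receives the largest weight, and the consistency ratio stays well under the 0.1 threshold; an ANP model would additionally allow dependencies among the criteria themselves.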

Relevance:

30.00%

Publisher:

Abstract:

Policy hierarchies and automated policy refinement are powerful approaches to simplifying the administration of security services in complex network environments. A crucial issue for the practical use of these approaches is ensuring the validity of the policy hierarchy: since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e. necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying upon the validation conditions and upon axioms about the model's representativeness, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
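A toy version of the refinement check can illustrate the two criteria. Here a policy is just an opaque label and `refines` maps each low-level policy to the high-level policy it was derived from; this set-based reading is our own simplification, not the paper's formal conditions.

```python
def validate_hierarchy(high_policies, low_policies, refines):
    """Check a two-layer policy hierarchy.

    consistency: every low-level policy refines a policy that actually
    exists at the higher level (no derived policy invents behaviour);
    completeness: every high-level policy is refined by at least one
    low-level policy (nothing the modeller specified is dropped).
    """
    consistency = all(refines.get(p) in high_policies for p in low_policies)
    completeness = all(
        any(refines.get(p) == h for p in low_policies) for h in high_policies
    )
    return consistency and completeness

# A valid hierarchy: both abstract policies are covered, and every
# concrete policy traces back to an abstract one.
ok = validate_hierarchy(
    {"H1", "H2"},
    {"L1", "L2", "L3"},
    {"L1": "H1", "L2": "H2", "L3": "H2"},
)
```

Dropping a refinement (incompleteness) or pointing a low-level policy at a non-existent abstract policy (inconsistency) makes the check fail, which mirrors the intent of the paper's validation conditions.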

Relevance:

30.00%

Publisher:

Abstract:

Objective:

To survey prevocational doctors working in Australian hospitals on aspects of postgraduate learning.
Participants and setting:

470 prevocational doctors in 36 health services in Australia, August 2003 to October 2004.
Design:

Cross-sectional cohort survey with a mix of ordinal multicategory questions and free text.
Main outcome measures:

Perceived preparedness for aspects of clinical practice; perceptions of the quantity and usefulness of current teaching and learning methods and desired future exposure to learning methods.
Results:

64% (299/467) of responding doctors felt generally prepared for their job, 91% (425/469) felt prepared for dealing with patients, and 70% (325/467) for dealing with relatives. A minority felt prepared for medicolegal problems (23%, 106/468), clinical emergencies (31%, 146/469), choosing a career (40%, 188/468), or performing procedures (45%, 213/469). Adequate contact with registrars was reported by 90% (418/465) and adequate contact with consultants by 56% (257/466); 20% (94/467) reported exposure to clinical skills training and 11% (38/356) to high-fidelity simulation. Informal registrar contact was described as useful or very useful by 94% (433/463), and high-fidelity simulation by 83% (179/216). Most prevocational doctors would prefer more formal instruction from their registrars (84%, 383/456) and consultants (81%, 362/447); 84% (265/316) want increased exposure to high-fidelity simulation and 81% (283/350) to professional college tutorials.
Conclusion:

Our findings should assist planning and development of training programs for prevocational doctors in Australian hospitals.

Relevance:

30.00%

Publisher:

Abstract:

Requirements engineering is an initial phase in the development of software applications and information systems. It is concerned with understanding and specifying the customer's requirements for the system to be delivered. Throughout the literature, this is agreed to be one of the most crucial and, unfortunately, problematic phases in development. Despite the diversity of research directions, approaches and methods, understanding and management of the process remain limited. Among contemporary approaches to improving the current practice of requirements engineering, the Formal Object-Oriented Method (FOOM) has been introduced as a promising new solution. The FOOM approach to requirements engineering is based on a synthesis of socio-organisational theory, the object-oriented approach, and mathematical formal specification. The entire FOOM specification process is evolutionary and involves a large volume of changes in requirements. During this process, requirements evolve through informal, semi-formal, and formal forms while maintaining a semantic link between these forms and, most importantly, conforming to the customer's requirements. A deep understanding of the complexity of the requirements model and its dynamics is critical to improving requirements engineering process management. This thesis investigates the benefits of documenting both the evolution of the requirements model and the rationale for that evolution. Design explanation explains and justifies the deliberations of, and decisions made during, the design activity. In this thesis, design explanation is used to describe the requirements engineering process in order to improve understandability of, and traceability within, the evolving requirements specification. 
The design explanation recorded during this research project also assisted the researcher in gaining insights into the creative and opportunistic characteristics of the requirements engineering process. This thesis offers an interpretive investigation into incorporating design explanation within FOOM in order to extend and enhance the method. The researcher's interpretation and analysis of the collected data highlight an insight-driven and opportunistic process rather than a strictly and systematically predefined one. In fact, the process was not smoothly evolutionary, but involved occasional 'crisis' points at which the model was reconceptualised, simplified and restructured. The contributions of the thesis therefore lie not only in an effective incorporation of design explanation within FOOM, but also in a deeper understanding of the dynamic process of requirements engineering. The new understanding of the complexity of the requirements model and its dynamics suggests new directions for future research and forms a basis for a new approach to process management.

Relevance:

30.00%

Publisher:

Abstract:

Today, Digital Systems and Services for Technology Supported Learning and Education are recognized as key drivers in transforming the way that individuals, groups and organizations "learn" and the way learning is assessed in the 21st century. These transformations influence: Objectives - moving from acquiring new "knowledge" to developing new and relevant "competences"; Methods - moving from "classroom"-based teaching to "context-aware" personalized learning; and Assessment - moving from "life-long" degrees and certifications to "on-demand" and "in-context" accreditation of qualifications. Within this context, promoting Open Access to Formal and Informal Learning is currently a key issue in the public discourse and the global dialogue on Education, including Massive Open Online Courses (MOOCs) and Flipped School Classrooms.

Relevance:

30.00%

Publisher:

Abstract:

The dissertation consists of four parts. The first is a survey of the formal incorporation of relative concerns into economic models. I believe there is room for such a survey, since none has been written since 1992, when the literature relevant to the present discussion began. The next essay consists of the proof of a theorem on the egalitarian distribution of wealth in the context of concern for social status. The conclusion is rather cynical towards one of the sacred cows of most utopian thinking. The third essay is again a theorem, again with cynical conclusions, concerning the intuition that a society whose members are sufficiently (but not perfectly) altruistic would be stable and free of conflict. The last essay is a conjecture based on a recent article by David Friedman. My ambition was to try to explain the apparently purely capricious and irrational behaviour of law enforcement in dictatorial regimes. What unites the essays is an attempt to revisit some discussions more typical of the humanities than of the social sciences, drawing on the formal toolkit of game theory and the intolerance of ambiguity nurtured by the latest generations of economists.

Relevance:

30.00%

Publisher:

Abstract:

MODSI is a multi-model tool for information systems modeling. A modeling process in MODSI can be driven according to three different approaches: informal, semi-formal and formal. The MODSI tool is therefore based on the combined use of these three modeling approaches. It can be employed at two different levels: the meta-modeling of a method and the modeling of an information system. In this paper we start by presenting the different types of modeling and analysing their particular features. Then we introduce the meta-model defined in our tool, as well as the tool's functional architecture. Finally, we describe and illustrate the various usage levels of this tool.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

The decreasing number of women graduating in the Science, Technology, Engineering and Mathematics (STEM) fields continues to be a major concern. Despite national support in the form of grants provided by the National Science Foundation and the National Center for Information and Technology, and legislation such as the Deficit Reduction Act of 2005 that encourages women to enter the STEM fields, the number of women actually graduating in these fields is surprisingly low. This research study focuses on a robotics competition and its ability to engage female adolescents in STEM curricula. Data have been collected to help explain why young women are reluctant to take technology or engineering courses in high school and college. Factors that have been described include attitudes, parental support, social aspects, peer pressure, and lack of role models. Often these courses were thought to have masculine and "nerdy" overtones; they usually had majority-male enrolments and appeared to be very competitive. With more female adolescents engaging in this type of competitive atmosphere, this study gathered information to discover what about the competition appealed to these young women. Focus groups were used to gather information from adolescent females participating in the First Lego League (FLL) and CEENBoT competitions. What enticed them to participate in a curriculum that, the data demonstrated, many of their peers avoided? FLL and CEENBoT are robotics programs based on curricula taught in afterschool programs in non-formal environments. These programs culminate in a very large robotics competition. My research questions included: What factors encouraged participants to take part in the robotics competition? What was the original enticement to the FLL and CEENBoT programs? What will make participants want to come back, and what are the participants' plans for the future? 
My research mirrored previous findings: lack of role models, the need for parental support, social stigma and peer pressure are still major factors determining whether adolescent females seek out STEM activities. An interesting finding, an exception to previous work, was that these female adolescents enjoyed the challenge of the competition. The informal learning environments encouraged an atmosphere of social engagement and cooperative learning. Many of the volunteers who led the afterschool programs were women (role models), and a majority of parents showed support by accommodating an afterschool schedule. The young women engaged in the competition noted it was a friendly competition, but they were all there to win. All who participated in the competition had a similar learning environment: competitive but cooperative. Further research is needed to determine whether it is the learning environment that lures adolescent females to the program and entices them to continue in the STEM fields, or the competitive aspect of the culminating activity. Advisors: James King and Allen Steckelberg

Relevance:

30.00%

Publisher:

Abstract:

Background: Although the release of cardiac biomarkers after percutaneous (PCI) or surgical revascularization (CABG) is common, its prognostic significance is not known. Questions remain about the mechanisms and degree of correlation between the release, the volume of myocardial tissue loss, and the long-term significance. Delayed-enhancement cardiac magnetic resonance (CMR) consistently quantifies areas of irreversible myocardial injury. To investigate the quantitative relationship between irreversible injury and cardiac biomarkers, we will evaluate the extent of irreversible injury in patients undergoing PCI and CABG and relate it to postprocedural changes in cardiac biomarkers and to long-term prognosis. Methods/Design: The study will include 150 patients with multivessel coronary artery disease (CAD) with preserved left ventricular ejection fraction (LVEF) and a formal indication for CABG; 50 patients will undergo CABG with cardiopulmonary bypass (CPB); 50 patients with the same arterial and ventricular condition indicated for myocardial revascularization will undergo CABG without CPB; and another 50 patients with CAD and preserved ventricular function will undergo PCI using stents. All patients will undergo CMR before and after surgery or PCI. We will also evaluate the release of cardiac markers of necrosis immediately before and after each procedure. The primary outcome considered is overall death over a 5-year follow-up. Secondary outcomes are levels of the CK-MB isoenzyme and troponin I in association with the presence of myocardial fibrosis and systolic left ventricular dysfunction assessed by CMR. Discussion: The MASS-V Trial aims to establish reliable reference values for enzyme markers of myocardial necrosis in the absence of manifest myocardial infarction after mechanical interventions. Establishing these indices has diagnostic and prognostic value and may therefore call for distinct therapeutic measures. 
In daily practice, the inappropriate use of these necrosis markers has led to misdiagnosis and therefore to wrong treatment. A more sensitive tool such as CMR provides unprecedented diagnostic accuracy for myocardial damage when correlated with necrosis enzyme markers. We aim to correlate laboratory data with imaging, thereby establishing more refined data on the presence or absence of irreversible myocardial injury after the procedure, whether percutaneous or surgical, and with or without the use of cardiopulmonary bypass.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Our objective in this study was to compare the assistance received by individuals in the United States and Sweden who share characteristics associated with low, moderate, or high 1-year placement risk in the United States. Methods: We used longitudinal nationally representative data from 4,579 participants aged 75 years and older in the 1992 and 1993 waves of the Medicare Current Beneficiary Survey (MCBS) and, for comparative purposes, cross-sectional data from 1,379 individuals aged 75 years and older in the Swedish Aging at Home (AH) national survey. We developed a logistic regression equation using U.S. data to identify individuals with three levels (low, moderate, or high) of predicted 1-year institutional placement risk. Groups with the same characteristics were identified in the Swedish sample and compared on formal and informal assistance received. Results: Formal service utilization was higher in the Swedish sample, whereas informal service use was lower overall. Individuals with characteristics associated with high placement risk received more formal and less informal assistance in Sweden relative to the United States. Discussion: The differences suggest that formal services supplement informal support in the United States and that formal and informal services are complementary in Sweden.
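The risk-stratification step, a logistic regression over survey covariates whose predicted probability is cut into low/moderate/high groups, can be sketched like this. The predictors, coefficients and cut-points below are invented for illustration; the study's fitted MCBS model is not reproduced here.

```python
import math

# Hypothetical coefficients for illustration only; the actual predictors
# and estimates from the MCBS model are not reproduced here.
COEFS = {
    "intercept": -4.0,
    "age_over_85": 1.1,
    "lives_alone": 0.8,
    "adl_limitations": 0.35,   # per limitation
    "cognitive_impairment": 1.4,
}

def placement_probability(person):
    """Predicted 1-year institutionalisation probability from a logit model."""
    z = COEFS["intercept"] + sum(
        COEFS[k] * person.get(k, 0) for k in COEFS if k != "intercept"
    )
    return 1.0 / (1.0 + math.exp(-z))

def risk_group(p, low=0.05, high=0.15):
    """Bucket a probability into low/moderate/high strata
    (cut-points are illustrative, not the study's)."""
    return "low" if p < low else ("moderate" if p < high else "high")

p_low = placement_probability({})
p_high = placement_probability(
    {"age_over_85": 1, "lives_alone": 1, "adl_limitations": 4,
     "cognitive_impairment": 1}
)
```

Once each Swedish respondent is assigned to the same strata via shared covariates, the groups can be compared on the formal and informal assistance they actually receive, which is the comparison the abstract reports.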

Relevance:

30.00%

Publisher:

Abstract:

Today, Digital Systems and Services for Technology Supported Learning and Education are recognized as key drivers in transforming the way that individuals, groups and organizations "learn" and the way learning is assessed in the 21st century. These transformations influence: Objectives - moving from acquiring new "knowledge" to developing new and relevant "competences"; Methods - moving from "classroom"-based teaching to "context-aware" personalized learning; and Assessment - moving from "life-long" degrees and certifications to "on-demand" and "in-context" accreditation of qualifications. Within this context, promoting Open Access to Formal and Informal Learning is currently a key issue in the public discourse and the global dialogue on Education, including Massive Open Online Courses (MOOCs) and Flipped School Classrooms. This volume on Digital Systems for Open Access to Formal and Informal Learning contributes to the international dialogue between researchers, technologists, practitioners and policy makers in Technology Supported Education and Learning. It addresses emerging issues related to both theory and practice, as well as methods and technologies that can support Open Access to Formal and Informal Learning. The twenty chapters, contributed by international experts who are actively shaping the future of Educational Technology around the world, present topics such as:
- The evolution of University Open Courses in Transforming Learning
- Supporting Open Access to Teaching and Learning of People with Disabilities
- Assessing Student Learning in Online Courses
- Digital Game-based Learning for School Education
- Open Access to Virtual and Remote Labs for STEM Education
- Teachers' and Schools' ICT Competence Profiling
- Web-Based Education and Innovative Leadership in a K-12 International School Setting
An in-depth blueprint of the promise, potential, and imminent future of the field, Digital Systems for Open Access to Formal and Informal Learning is necessary reading for researchers and practitioners, as well as undergraduate and postgraduate students, in educational technology.

Relevance:

30.00%

Publisher:

Abstract:

Dynamic changes in ERP topographies can be conveniently analyzed by means of microstates, the so-called "atoms of thought", which represent brief periods of quasi-stable synchronized network activation. Comparing temporal microstate features such as onset, offset, or duration between groups and conditions therefore allows a precise assessment of the timing of cognitive processes. So far, this has been achieved by assigning the individual time-varying ERP maps to spatially defined microstate templates obtained from clustering the grand mean data into predetermined numbers of topographies (microstate prototypes). Features obtained from these individual assignments were then statistically compared. This approach has the problem that individual noise dilutes the match between individual topographies and templates, leading to lower statistical power. We therefore propose a randomization-based procedure that works without assigning grand-mean microstate prototypes to individual data. In addition, we propose a new criterion to select the optimal number of microstate prototypes based on cross-validation across subjects. After a formal introduction, the method is applied to a sample data set from an N400 experiment and to simulated data with varying signal-to-noise ratios, and the results are compared to existing methods. In a first comparison with previously employed statistical procedures, the new method showed increased robustness to noise and higher sensitivity to more subtle effects of microstate timing. We conclude that the proposed method is well suited for the assessment of timing differences in cognitive processes. The increased statistical power allows identifying more subtle effects, which is particularly important in small and hard-to-recruit patient populations.
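The flavour of a randomization-based comparison of timing features can be illustrated with a paired sign-flip test on a per-subject measure such as microstate duration. This is a generic permutation test sketched under our own assumptions, not the authors' exact procedure.

```python
import random

def permutation_test(cond_a, cond_b, n_perm=5000, seed=0):
    """Paired randomization test on a per-subject timing feature
    (e.g. microstate duration in ms). Under the null hypothesis the
    condition labels are exchangeable within each subject, so the sign
    of each paired difference is flipped at random to build the null
    distribution of the mean difference."""
    rng = random.Random(seed)
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    observed = sum(diffs) / len(diffs)
    hits = 0
    for _ in range(n_perm):
        perm = sum(d * rng.choice((-1, 1)) for d in diffs) / len(diffs)
        if abs(perm) >= abs(observed):
            hits += 1
    # +1 correction so the p-value is never exactly zero
    return observed, (hits + 1) / (n_perm + 1)

# Simulated per-subject durations (ms): condition B is 8 ms shorter.
cond_a = [85, 90, 88, 92, 87, 91, 89, 86, 93, 90]
cond_b = [d - 8 for d in cond_a]
observed, p = permutation_test(cond_a, cond_b)
```

Because the test operates directly on the per-subject feature, no assignment of grand-mean prototypes to noisy individual maps is needed, which is the source of the robustness gain the abstract describes.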