864 results for pacs: data handling techniques
Abstract:
Many systems and applications continuously produce events. These events record the status of a system and trace its behavior, and by examining them system administrators can identify potential problems. If the temporal dynamics of the systems are further investigated, underlying patterns can be discovered. The uncovered knowledge can be leveraged to predict future system behavior or to mitigate potential risks. Moreover, system administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining techniques, their application to system event mining is still at a rudimentary stage. Most existing work still focuses on episode mining or frequent pattern discovery. These methods cannot provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they offer little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From a data mining perspective, three correlated directions are considered helpful for system management: (1) provide concise yet comprehensive summaries of the running status of the systems; (2) make the systems more intelligent and autonomous; (3) effectively detect abnormal system behavior. Owing to the richness of the event logs, all of these directions can be addressed in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached. This dissertation mainly focuses on the foregoing directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed, including event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficacy of the corresponding solutions.
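As an illustration of the kind of data-driven, streaming anomaly detection this abstract refers to, the following is a minimal sketch, not taken from the dissertation: it flags per-interval event counts whose rolling z-score exceeds a threshold. The class name, window size and threshold are illustrative assumptions.

```python
from collections import deque
import math

class RollingZScoreDetector:
    """Hypothetical streaming detector: flags intervals whose event count deviates
    sharply (in z-score terms) from a rolling window of recent counts."""

    def __init__(self, history=50, threshold=3.0):
        self.window = deque(maxlen=history)  # recent per-interval event counts
        self.threshold = threshold           # z-score above which a count is flagged

    def update(self, count):
        """Return True if `count` looks anomalous relative to the rolling history."""
        anomalous = False
        if len(self.window) >= 10:           # require a minimal history first
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(count - mean) / std > self.threshold
        self.window.append(count)
        return anomalous

# Example: per-minute event counts from a log stream, with a burst at the end.
detector = RollingZScoreDetector()
for c in [12, 11, 13, 12, 10, 14, 11, 12, 13, 12, 11, 95]:
    if detector.update(c):
        print("anomalous interval:", c)
```

In the dissertation itself, richer temporal-mining models would replace this toy statistic; the sketch only shows where such a detector sits in an event-processing pipeline.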
Abstract:
The meaning of work is a construct that has been studied more systematically since the 1980s, through various approaches and in different occupational categories. This dissertation aims to describe and discuss the meanings of work for construction workers. It is an empirical study grounded in the Attributes Model of the Meaning of Work and its measurement instrument, the Meaning of Work Inventory (IST). The research involved 402 workers from the construction industry in two capitals of the Brazilian Northeast, with a mean age of 35.8 years (SD = 11.4). Data were collected using the IST together with a Working Conditions Questionnaire and a sociodemographic questionnaire, and were organized and analyzed in SPSS. The analysis techniques included Smallest Space Analysis (SSA), descriptive statistics, correlations and analysis of variance. There was evidence of validity for the IST, which was structured into five types of value attributes (what work should be) and seven types of descriptive attributes (what work actually is). The results showed that work has high centrality for the participants, ranking just after family as the most important aspect of their lives. Aspects of personal and economic growth were most emphasized in defining what work should be, while responsibility and effort were the characteristics that best described the reality of work.
Abstract:
Support foundations are a type of private-law legal entity created to support research, education and extension projects and the institutional, scientific and technological development of Brazil. Viewed as links between companies, universities and government, support foundations emerged on the Brazilian scene from the intention of establishing an economic development platform based on three pillars: science, technology and innovation (ST&I). In applied terms, they operate as debureaucratization tools, making management between public entities more agile, especially academic management, in line with the Triple Helix approach. Accordingly, this study aims to understand how Triple Helix relations affect the fund-raising process of Brazilian support foundations. To examine these relations, the study draws on the university-industry-government interaction model of Sábato and Botana (1968), the Triple Helix approach proposed by Etzkowitz and Leydesdorff (2000), and the national innovation systems perspective discussed by Freeman (1987, 1995), Nelson (1990, 1993) and Lundvall (1992). The research object consists of the 26 state research support foundations associated with the National Council of State Research Support Foundations (CONFAP) and the 102 foundations supporting higher education institutions (IES) associated with the National Council of Support Foundations for Institutions of Higher Education and Scientific and Technological Research (CONFIES), totaling 128 entities. The study is applied research with a quantitative approach. Primary data were collected through an e-mail survey, yielding 75 observations, which corresponds to 58.59% of the research universe; the bootstrap method was used to validate the use of this sample in the analysis of results. Data were analyzed using descriptive statistics and multivariate techniques: cluster analysis, canonical correlation and binary logistic regression. The canonical roots obtained indicated that the dependence between the relationship variables (with Triple Helix actors) and the financial resources invested in innovation projects is low, supporting the study's null hypothesis that Triple Helix relations have not interfered, positively or negatively, with raising funds for investment in innovation projects. On the other hand, the cluster analysis indicates that the entities with the largest numbers of projects and largest project funding are mostly large foundations (over 100 employees) that support up to five IES, publish management reports and rely more heavily on public funding in their capital structure. Finally, the classification power of the logistic model obtained in this study showed high predictive capacity (80.0%), allowing the academic community to replicate it in similar analysis settings.
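As a purely illustrative sketch of part of the statistical pipeline named above (a binary logistic regression validated with the bootstrap), the snippet below uses scikit-learn on hypothetical data; the file name, predictor columns and target variable are invented for illustration and are not the study's variables.

```python
# Illustrative sketch: bootstrap-evaluated binary logistic regression.
# The CSV file, column names and target are hypothetical, not the study's data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

df = pd.read_csv("foundations_survey.csv")            # hypothetical survey responses
X = df[["employees", "supported_ies", "publishes_reports"]].to_numpy()
y = df["high_innovation_funding"].to_numpy()          # hypothetical binary outcome

rng = np.random.default_rng(42)
scores = []
for _ in range(1000):                                  # bootstrap resamples
    idx = rng.integers(0, len(y), size=len(y))         # draw rows with replacement
    oob = np.setdiff1d(np.arange(len(y)), idx)         # out-of-bag rows for scoring
    if len(oob) == 0 or len(np.unique(y[idx])) < 2:
        continue                                       # skip degenerate resamples
    model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    scores.append(accuracy_score(y[oob], model.predict(X[oob])))

print(f"mean out-of-bag classification accuracy: {np.mean(scores):.1%}")
print(f"95% interval: {np.percentile(scores, 2.5):.1%} to {np.percentile(scores, 97.5):.1%}")
```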
Abstract:
The integration of architectural design and structural systems constitutes, in academic education, one of the main challenges for the teaching of architectural design. Recent studies point to the use of computational tools in academic settings as an important strategy for such integration. Although teaching experiences using BIM (Building Information Modeling) have been incorporated by architecture schools in recent years, there is still a need for didactic and pedagogical practices that promote the integrated teaching of architectural design and structure. This paper analyzes experiences developed at UFRN and UFPB, seeking to identify the tools, processes and products used and to point out limitations and potentials of the subjects taught at these institutions. The research begins with a literature review on BIM teaching and on aspects related to the integration of architectural design and structure. Data collection techniques in the design studio included direct observation, questionnaires and interviews with students and teachers, with a mixed (qualitative and quantitative) method of analysis. At UFRN, the Integrated Workshop, a compulsory subject in the curriculum, favors the integration studied here, as it brings teachers from different disciplines into the same project studio. Regarding the use of BIM, the courses form initial users, BIM modelers, able to extract quantities automatically and to speed up production, gaining quality in the products; however, learning the tool while designing in parallel causes some difficulties. At UFPB, the lack of required courses on BIM leads to a lack of knowledge of, and confidence in, the tool and its processes among most students. Greater efforts by the schools are therefore needed to adopt BIM skills and training. Work is needed both on the BIM concept, in order to promote the BIM process and consequently a better use of the tools, and on avoiding the technology being reduced to a mere tool and becoming obsolete. The inclusion of specific subjects with more advanced BIM skills is suggested, through partnerships with engineering programs and the promotion of transdisciplinary integration that favors the exchange between the different cultures of the academic environment.
Abstract:
Soft skills and teamwork practices were identified as the main deficiencies of recent graduates of computing courses. This issue led to a qualitative study aimed at investigating the challenges faced by professors of those courses in conducting, monitoring and assessing collaborative software development projects. Different challenges were reported by the teachers, including difficulties in assessing students at both the collective and the individual level. In this context, a quantitative study was conducted to map students' soft skills to a set of indicators that can be extracted from software repositories using data mining techniques. These indicators are intended to measure soft skills such as teamwork, leadership, problem solving and the pace of communication. A peer assessment approach was then applied in a collaborative software development course of the software engineering major at the Federal University of Rio Grande do Norte (UFRN). This research presents a correlation study between the students' soft skill scores and indicators based on mining software repositories. The study contributes by: (i) presenting professors' perceptions of the difficulties and opportunities for improving management and monitoring practices in collaborative software development projects; (ii) investigating relationships between soft skills and the activities students perform in software repositories; (iii) encouraging the development of soft skills and the use of software repositories among software engineering students; and (iv) contributing to the state of the art of three important areas of software engineering, namely software engineering education, educational data mining and human aspects of software engineering.
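As an illustration of the kind of repository-based indicator and correlation analysis described above, the following is a minimal sketch; the repository path, author e-mails and peer-assessment scores are hypothetical, and commit count is only one crude stand-in for the richer indicators the study mines.

```python
# Illustrative sketch: one repository-mined indicator (commits per author) correlated
# with hypothetical peer-assessment scores. Paths, e-mails and scores are invented.
import subprocess
from collections import Counter
from scipy.stats import pearsonr

def commits_per_author(repo_path):
    """Count commits per author e-mail via `git log` (a crude activity indicator)."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--format=%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(out.split())

indicator = commits_per_author("./team-project")       # hypothetical student repository

# Hypothetical peer-assessment "teamwork" scores keyed by the same author e-mails.
peer_scores = {"alice@example.com": 4.5, "bob@example.com": 3.0, "carol@example.com": 4.0}

authors = sorted(set(indicator) & set(peer_scores))
x = [indicator[a] for a in authors]                     # repository-based indicator
y = [peer_scores[a] for a in authors]                   # soft-skill score
r, p = pearsonr(x, y)
print(f"correlation between commit count and teamwork score: r={r:.2f}, p={p:.3f}")
```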
Abstract:
The ability of the citizens of a nation to determine their own representation has long been regarded as one of the most critical objectives of any electoral system. Without the assurance of equality in representation, the fundamental nature and operation of the political system is severely undermined. Given the centuries of institutional reforms and population changes in the American system, Congressional Redistricting stands as an institution whereby this promise of effective representation can either be fulfilled or denied. The broad set of processes that encapsulate Congressional Redistricting has been discussed, experimented with, and modified to achieve clear objectives, and has long been understood to be important. Questions remain about how the dynamics that link all of these processes operate and what impact the realities of Congressional Redistricting hold for representation in the American system. This dissertation examines three aspects of how Congressional Redistricting in the United States operates in accordance with the principle of “One Person, One Vote.” By utilizing data and data analysis techniques from Geographic Information Systems (GIS), this dissertation seeks to address how Congressional Redistricting affects the principle of one person, one vote from the standpoint of legislator accountability, redistricting institutions, and the promise of effective minority representation.
Abstract:
In recent years, most low- and middle-income countries have adopted different approaches to universal health coverage (UHC) to ensure equity and financial risk protection in access to essential healthcare services. UHC-related policies and delivery strategies are largely based on existing healthcare systems, the result of gradual development shaped by local factors and priorities. Most countries have emphasized health financing and human resources for health (HRH) reform policies, drawing on good practices from several healthcare plans, to deliver UHC for their populations.
Health financing and labor market frameworks were used to understand health financing and HRH dynamics and to analyze key health policies implemented over the past decade in Kenya’s effort to achieve UHC. Based on this understanding, policy options are proposed for Kenya by analyzing, and generating lessons from, health financing and HRH reform experiences in China. Data were collected using a mixed-methods approach, combining quantitative (document and literature review) and qualitative (in-depth interview) data collection techniques.
The problems in Kenya are substantial: high levels of out-of-pocket health expenditure, slow progress in expanding health insurance among informal-sector workers, inefficiencies in the pooling of healthcare revenues, inadequate deployment and maldistribution of HRH, and inadequate quality measures in health worker training. The government has identified the critical role of strengthening primary health care and the National Hospital Insurance Fund (NHIF) in Kenya’s move towards UHC. Strengthening primary health care requires redefining the roles of hospitals and health insurance schemes, and training, deploying and retaining primary care professionals according to the health needs of the population; these concepts are not emphasized in Kenya’s healthcare reforms or program design. Commitment from Kenya’s top leadership is urgently needed to implement tougher reforms, and important lessons can be drawn from China’s extensive health reforms of the past decade. Key lessons from China include expanding health insurance through rigorous research, monitoring and evaluation; substantially increasing government health expenditure; innovative strengthening of primary healthcare; designing and implementing health policy reforms that are responsive to the population; and regional approaches to strengthening HRH.
Abstract:
Advanced Placement (AP) is a series of courses and tests designed to determine mastery of introductory college material, and it has become part of the American educational system. The changing conception of AP was examined using critical theory to determine what led to a view of continual success. The study utilized David Armstrong’s variation of Michel Foucault’s critical theory to construct an analytical framework, with Black and Ubbes’ data gathering techniques and Braun and Clark’s data analysis methods. Data included 1135 documents: 641 journal articles, 421 newspaper articles and 82 government documents. The study revealed three historical ruptures, each correlated with a theme and subthemes. The first rupture was the Sputnik launch in 1958. Its correlated theme was AP leading to school reform, with subthemes of AP as reform for able students and AP’s gaining acceptance from secondary schools and higher education. The second rupture was the A Nation at Risk report published in 1983. Its correlated theme was AP’s shift in emphasis from the exam to the course, with subthemes of AP as a course, a shift in AP’s target population, the use of AP courses to promote equity, and AP courses modifying curricula. The passage of the No Child Left Behind Act of 2001 was the third rupture. Its correlated theme was AP as a means to narrow the achievement gap, with subthemes of AP as a college preparatory program and the shift of AP to an open-access program. The themes revealed a perception of the program as progressively integrated into American education. The AP program changed its emphasis from tests to curriculum and is seen as the nation’s premier academic program to promote reform and prepare students for college. It has become a major source of income for the College Board; in effect, AP has become an agent of privatization, spurring other private entities into competition for government funding. The change and growth of the program over the past 57 years resulted in its deep integration into American education. As such, the program remains an intrinsic part of the system and continues to evolve within American education.
Abstract:
The stages of phonetic-phonological change have been described for decades, especially from an articulatory point of view and almost always on the basis of the written testimonies that were available. Recently, however, new theories have emerged which hold that change can be explained through the study of variation and of the phonetic processes of present-day speech, since both are related to phenomena of hypo- (and hyper-) articulation and, ultimately, of coarticulation. One of these is Evolutionary Phonology (Blevins 2004), even though it does not offer a satisfactory explanation for the diffusion of change. In this study, these theories are drawn upon to clarify the causes of the evolution of two contexts of yod segunda, /nj/ and /lj/, which led to the phonologization of /ɲ/ and /ʎ/ in an early stage of the history of Spanish.
Abstract:
Mathematical models are increasingly used in environmental science, which increases the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model, the Operational Street Pollution Model (OSPM). To assess the predictive validity of the model, the data are split into an estimation and a prediction data set using two data-splitting approaches, and data preparation techniques (clustering and outlier detection) are analysed. The sensitivity analysis, as part of the identifiability analysis, showed that some model parameters were significantly more sensitive than others. Applying the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was also shown that the frequentist approach applied to the uncertainty calculations underestimated the parameter uncertainties. The model parameter uncertainty was qualitatively assessed to be significant, and reduction strategies were identified.
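To illustrate the general workflow the abstract names (splitting data into estimation and prediction sets and screening parameter sensitivity), here is a minimal sketch on synthetic data; the toy model and parameter names are hypothetical stand-ins and have nothing to do with the actual OSPM formulation.

```python
# Illustrative sketch: data splitting plus a one-at-a-time (local) sensitivity screen.
# The model function and parameter names are hypothetical, not the OSPM equations.
import numpy as np

rng = np.random.default_rng(0)

def toy_model(params, x):
    """Hypothetical stand-in for the dispersion model: concentration vs. input."""
    a, b = params["emission_factor"], params["dispersion_rate"]
    return a * x / (1.0 + b * x)

# Synthetic observations split into estimation (75%) and prediction (25%) sets.
x = rng.uniform(1, 10, size=200)
y = toy_model({"emission_factor": 2.0, "dispersion_rate": 0.3}, x) + rng.normal(0, 0.1, 200)
idx = rng.permutation(len(x))
est, pred = idx[:150], idx[150:]

def rmse(params, sel):
    return float(np.sqrt(np.mean((toy_model(params, x[sel]) - y[sel]) ** 2)))

base = {"emission_factor": 2.0, "dispersion_rate": 0.3}
print("prediction RMSE at baseline parameters:", round(rmse(base, pred), 3))

# One-at-a-time sensitivity: relative change in estimation RMSE for a +10% perturbation.
for name in base:
    perturbed = dict(base, **{name: base[name] * 1.10})
    delta = (rmse(perturbed, est) - rmse(base, est)) / rmse(base, est)
    print(f"sensitivity of fit to {name}: {delta:+.1%}")
```

Parameters whose perturbation barely changes the fit are weakly identifiable, which is the kind of conclusion the identifiability analysis above draws, albeit with far more rigorous methods.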
Abstract:
A wide variety of panoramic lenses is now available on the market, some of which have remarkable characteristics. Belonging to this latter category, Panomorph lenses are anamorphic panoramic lenses whose distortion profile is strongly non-uniform, which creates zones of increased magnification in the field of view. In a mobile robotics context, these particularities can be exploited in stereoscopic systems for the 3D reconstruction of objects of interest, allowing both a good knowledge of the environment and access to finer details thanks to the zones of increased magnification. However, because of their complexity, these lenses are difficult to calibrate and, to our knowledge, no study has really been devoted to this question. The main objective of this thesis is the design, development and performance evaluation of Panomorph stereoscopic systems. Calibration was carried out with an established technique using planar targets and a widely used calibration toolbox. In addition, new mathematical techniques aiming to restore the rotational symmetry of the image (circle) and to make the focal length uniform (uniform circle) were developed to see whether they could make calibration easier. First, the field of view was divided into zones within which the instantaneous focal length varies little, and calibration was performed for each of them. Then, the overall calibration of the systems was also carried out for the whole field of view at once. The results showed that the zone-by-zone calibration technique produces no significant gain in the quality of the 3D reconstructions of objects of interest compared with the overall calibration. However, studying this new approach made it possible to evaluate the performance of Panomorph stereoscopic systems over the whole field of view and to show that quality 3D reconstructions can be obtained in every zone. Moreover, the circle mathematical technique produced 3D reconstruction results generally equivalent to those obtained with the original coordinates. Since some calibration tools, unlike the one used in this work, provide only a single degree of freedom for the focal length, this technique could make it possible to calibrate Panomorph lenses with them. Finally, some conclusions could be drawn about the decisive factors influencing the quality of 3D reconstruction with Panomorph stereoscopic systems and about the characteristics to favour when choosing the lenses. The difficulty of calibrating Panomorph optics in the laboratory led to the development of a virtual calibration technique using optical design software and a calibration toolbox. This approach made it possible to run simulations on the impact of operating conditions on the calibration parameters and on the effect of calibration conditions on reconstruction quality. Experiments of this kind are practically impossible to carry out in the laboratory but are of definite interest to users.
Virtual calibration of a traditional lens also showed that the mean reprojection error, commonly used to evaluate the quality of a calibration, is not necessarily a reliable indicator of 3D reconstruction quality. Additional data are therefore needed to judge the quality of a calibration adequately.
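For readers unfamiliar with the planar-target calibration and reprojection-error metric discussed above, the following is a minimal sketch using OpenCV as a stand-in for "a widely used calibration toolbox"; the thesis's actual toolbox, lens model, image set and board size are not specified here, so every file name and parameter below is an assumption.

```python
# Illustrative sketch: checkerboard (planar-target) calibration and RMS reprojection
# error with OpenCV. The image folder and board dimensions are hypothetical.
import glob
import cv2
import numpy as np

board = (9, 6)                                   # inner corner count of the checkerboard
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)  # target-plane coordinates

obj_points, img_points, size = [], [], None
for path in glob.glob("calib_images/*.png"):     # hypothetical calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("estimated camera matrix:\n", K)
print("RMS reprojection error:", rms)
```

As the abstract cautions, a low RMS reprojection error from such a routine should not, on its own, be taken as proof that the resulting 3D reconstructions will be accurate.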
Abstract:
Final internship report presented to the Escola Superior de Dança with a view to obtaining the Master's degree in Dance Teaching.
Abstract:
Recent advances in the massively parallel computational abilities of graphics processing units (GPUs) have increased their use for general-purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We review the different binary output formats which may be encountered from the CUDA compiler and their implications for reverse engineering. We then demonstrate the process of disassembling an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.
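As a rough illustration of the black-box disassembly workflow the abstract describes, the sketch below drives Nvidia's cuobjdump utility from Python to dump the PTX and SASS embedded in a compiled binary and to list the kernel entry points that survive in it; it assumes the CUDA toolkit's cuobjdump is on the PATH, the binary name is hypothetical, and the regex is a naive heuristic rather than a technique from the paper.

```python
# Illustrative sketch: a first black-box look at a CUDA binary via cuobjdump.
# Assumes the CUDA toolkit's `cuobjdump` is on PATH; "./app" is a hypothetical binary.
import re
import subprocess

def cuobjdump(binary, *flags):
    """Run cuobjdump with the given flags and return its textual output."""
    result = subprocess.run(["cuobjdump", *flags, binary],
                            capture_output=True, text=True, check=True)
    return result.stdout

binary = "./app"
ptx = cuobjdump(binary, "--dump-ptx")    # embedded PTX, if the compiler kept it
sass = cuobjdump(binary, "--dump-sass")  # SASS disassembly of the embedded device code

# Kernel entry names often survive in the PTX and leak hints about the algorithm.
kernels = sorted(set(re.findall(r"\.entry\s+([^\s(]+)", ptx)))
print("kernel entry points found in PTX:", kernels)
print("SASS listing size:", len(sass.splitlines()), "lines")
```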
Abstract:
Master's degree in Physiotherapy.