964 results for field of solenoid
Abstract:
In this paper, we reflect on the broadening of the field of application of CRM from the business domain to a wider context of relationships, one in which the inclusion of non-profit organizations seems natural. In particular, we focus on analyzing the suitability of adopting CRM processes in universities and higher education institutions dedicated to e-learning. In our opinion, this issue has much potential but has so far received little attention in research.
Abstract:
This article describes the process of adapting Social Education studies to the European Higher Education Area undertaken by a team of the teaching staff at the University of Girona (Spain). The aim of the experience is to build a curriculum based on the competencies recognized as such by professionals in the field of social education in our region. The article specifies the development of the various phases, each involving the active participation of professionals and teaching staff from the universities. To conclude, the main characteristics of the curriculum are highlighted.
Abstract:
Context. MGRO J2019+37 is an unidentified extended source of very high energy gamma-rays originally reported by the Milagro Collaboration as the brightest TeV source in the Cygnus region. Its extended emission could be powered by either a single source or several sources. The GeV pulsar AGL J2020.5+3653, discovered by AGILE and associated with PSR J2021+3651, could contribute to the emission from MGRO J2019+37. Our aim is to identify radio and near-infrared sources in the field of the extended TeV source MGRO J2019+37, and to study potential counterparts to explain its emission. Methods: We surveyed a region of about 6 square degrees with the Giant Metrewave Radio Telescope (GMRT) at a frequency of 610 MHz. We also observed the central square degree of this survey in the near-infrared Ks-band using the 3.5 m telescope at Calar Alto. Archival X-ray observations of some specific fields are included. VLBI observations of an interesting radio source were performed. We explored possible scenarios to produce the multi-TeV emission from MGRO J2019+37 and studied which of the sources could be the main particle accelerator. Results: We present a catalogue of 362 radio sources detected with the GMRT in the field of MGRO J2019+37, and the results of a cross-correlation of this catalogue with one obtained at near-infrared wavelengths, which contains ∼3 × 10^5 sources, as well as with available X-ray observations of the region. Some peculiar sources inside the ∼1° uncertainty region of the TeV emission from MGRO J2019+37 are discussed in detail, including the pulsar PSR J2021+3651 and its pulsar wind nebula PWN G75.2+0.1, two new radio-jet sources, the H II region Sh 2-104 containing two star clusters, and the radio source NVSS J202032+363158. We also find that the hadronic scenario is the most likely in the case of a single accelerator, and discuss the possible contribution from the sources mentioned above.
Conclusions: Although the radio and GeV pulsar PSR J2021+3651 / AGL J2020.5+3653 and its associated pulsar wind nebula PWN G75.2+0.1 can contribute to the emission from MGRO J2019+37, extrapolation of the GeV spectrum does not explain the detected multi-TeV flux. Other sources discussed here could contribute to the emission of the Milagro source.
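The extrapolation argument in the conclusions can be sketched numerically: extend a power-law spectrum, dN/dE = N0 (E/E0)^(-Γ), fitted at GeV energies out to the TeV band and compare it with the observed flux. All parameter values below are illustrative placeholders, not the measured AGL J2020.5+3653 parameters.

```python
# Illustrative power-law extrapolation dN/dE = N0 * (E / E0)**(-gamma).
# Normalisation, pivot energy, and index are placeholder values,
# not the fitted AGL J2020.5+3653 spectrum.

def power_law_flux(E, N0, E0, gamma):
    # Differential photon flux at energy E (same units as N0).
    return N0 * (E / E0) ** (-gamma)

N0 = 1e-10      # photons / (cm^2 s GeV) at the pivot energy (assumed)
E0 = 1.0        # pivot energy, GeV (assumed)
gamma = 2.5     # spectral index (assumed)

flux_gev = power_law_flux(1.0, N0, E0, gamma)      # at 1 GeV
flux_tev = power_law_flux(1.0e4, N0, E0, gamma)    # at 10 TeV = 1e4 GeV
```

With a soft index such as Γ = 2.5, the extrapolated differential flux drops by ten orders of magnitude between 1 GeV and 10 TeV, which illustrates why a simple extrapolation of a GeV spectrum can fall short of a measured multi-TeV flux.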
Abstract:
This thesis studies the properties and usability of operators called t-norms, t-conorms and uninorms, as well as many-valued implications and equivalences. Weights and a generalized mean are embedded into these operators for aggregation; because the resulting operators are used for comparison tasks, they are referred to as comparison measures. The thesis illustrates how these operators can be weighted with differential evolution and aggregated with a generalized mean, and the kinds of comparison measures that can be achieved with this procedure. New operators suitable for comparison measures are suggested. These operators are combination measures based on the use of t-norms and t-conorms, the generalized 3_-uninorm, and pseudo-equivalence measures based on S-type implications. The empirical part of this thesis demonstrates how these new comparison measures work in the field of classification, for example in the classification of medical data. The second application area is from the field of sports medicine and concerns an expert system for defining an athlete's aerobic and anaerobic thresholds. The core of this thesis offers definitions for comparison measures and illustrates that there is no actual difference between the results achieved in comparison tasks by comparison measures based on distance and those based on many-valued logical structures. The approach taken in this thesis is highly practical, and all usage of the measures has been validated mainly by practical testing. In general, many different types of operators suitable for comparison tasks have been presented in the fuzzy logic literature, but there has been little or no experimental work with these operators.
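A minimal sketch of such a comparison measure, with illustrative operator and weight choices (product t-norm, its dual probabilistic-sum t-conorm, and a weighted power mean; these particular choices are examples, not necessarily those of the thesis):

```python
# Minimal sketch of a comparison measure built from a t-norm/t-conorm
# pair and a weighted generalized (power) mean. Illustrative only.

def t_norm(a, b):
    return a * b          # product t-norm

def t_conorm(a, b):
    return a + b - a * b  # probabilistic sum (dual t-conorm)

def generalized_mean(values, weights, p):
    # Weighted power mean (sum w_i * x_i**p)**(1/p); weights sum to 1.
    return sum(w * x ** p for w, x in zip(weights, values)) ** (1.0 / p)

def similarity(x, y, weights, p=2.0):
    # Component-wise equivalence 1 - |x_i - y_i| on [0, 1] features,
    # aggregated with the weighted generalized mean.
    comps = [1.0 - abs(a - b) for a, b in zip(x, y)]
    return generalized_mean(comps, weights, p)

s = similarity([0.9, 0.4, 0.7], [0.8, 0.5, 0.7], weights=[0.5, 0.3, 0.2])
```

Identical feature vectors yield a similarity of 1.0, and the weights let a differential-evolution search emphasize the features that matter for a given classification task.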
Abstract:
This thesis is concerned with the philosophical grammar of certain psychiatric concepts, which play a central role in delineating the field of psychiatric work. The concepts studied are ‘psychosis’, ‘delusion’, ‘person’, ‘understanding’ and ‘incomprehensibility’. The purpose of this conceptual analysis is to provide a more perspicuous view of the logic of these concepts, how psychiatric work is constituted in relation to them, and what this tells us about the relationships between the conceptual and the empirical in psychiatric concepts. The method used in the thesis is indebted primarily to Ludwig Wittgenstein’s conception of philosophy, where we are urged to look at language uses in relation to practices in order to obtain a clearer overview of practices of interest; this will enable us to resolve the conceptual problems related to these practices. This questioning takes as its starting point the concept of psychosis, a central psychiatric concept during the twentieth century. The conceptual analysis of ‘psychosis’ shows that the concept is logically dependent on the concepts of ‘understanding’ and ‘person’. Following the lead found in this analysis, the logic of person-concepts in psychiatric discourse is analysed by a detailed textual analysis of a psychiatric journal article. The main finding is the ambiguous uses of ‘person’, enabling a specifically psychiatric form of concern in human affairs. The grammar of ‘understanding’ is then tackled from the opposite end, by exploring the logic of the concept of ‘incomprehensibility’. First, by studying the DSM-IV definition of delusion it is shown that its ambiguities boil down to the question of whether psychiatric practice is better accounted for in terms of the grammar of ‘incorrectness’ or ‘incomprehensibility’. Second, the grammar of ‘incomprehensibility’ is further focused on by introducing the distinction between positive and negative conceptions of ‘incomprehensibility’. 
The main finding is that this distinction has wide-ranging implications for our understanding of psychiatric concepts. Finally, some of the findings gained in these studies are ‘put into practice’ in studying the more practical question of the conceptual and ethical problems associated with the concept of ‘prodromal symptom of schizophrenia’ and the agenda of early detection and intervention in schizophrenia more generally.
Abstract:
In modern-day organizations there is an increasing number of IT devices such as computers, mobile phones and printers. These devices can be located and maintained by using specialized IT management applications. Costs related to a single device accumulate from various sources and are normally categorized as direct costs, such as hardware costs, and indirect costs, such as labor costs. These costs can be saved in a configuration management database and presented to users using web-based development tools such as ASP.NET. The overall cost of an IT device during its lifecycle can be ten times higher than the actual purchase price of the product, and the ability to define and reduce these costs can save organizations a noticeable amount of money. This Master's Thesis introduces the research field of IT management and defines a custom framework model based on Information Technology Infrastructure Library (ITIL) best practices, designed to be implemented as part of an existing IT management application for defining and presenting IT costs.
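The direct/indirect split described above amounts to a simple total-cost-of-ownership calculation; a sketch with purely illustrative cost categories and figures (not taken from the thesis or from ITIL):

```python
# Illustrative lifecycle TCO for one device: direct costs (hardware,
# licences) plus indirect costs (support labour, downtime, training).
# All categories and figures are made up for the example.
direct = {"hardware": 900.0, "software_licences": 300.0}
indirect = {"support_labour": 2400.0, "downtime": 600.0, "training": 400.0}

purchase_price = direct["hardware"]
tco = sum(direct.values()) + sum(indirect.values())
ratio = tco / purchase_price   # lifecycle cost vs. purchase price
```

Even with these modest placeholder figures, the lifecycle cost is several times the purchase price, which is the effect the thesis aims to make visible in the management application.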
Abstract:
This paper proposes a calibration method that can be utilized for the analysis of SEM images. The field of application of the developed method is the calculation of the surface potential distribution of a biased silicon edgeless detector. The suggested processing of the data collected by SEM consists of several stages and takes into account different aspects affecting the SEM image. The calibration method does not claim to be precise, but it nevertheless captures the basic features of the potential distribution when different biasing voltages are applied to the detector.
Abstract:
This work proposes a method of visualizing the trend of research in the field of ceramic membranes from 1999 to 2006. The presented approach involves identifying problems encountered during research in the field of ceramic membranes. Patents from the US patent database and articles from ScienceDirect (by Elsevier) were analyzed for this work. The identification of problems was achieved with the software Knowledgist, which focuses on the semantic nature of a sentence to generate series of subject-action-object structures. The identified problems are classified into major research issues. This classification was used for the visualization of the intensity of research. The image produced gives the relation between the number of patents, time, and the major research issues. The identification of the most cited papers, which strongly influence the research on the previously identified major issues in the given field, was also carried out. The relations between these papers are presented using the metaphor of a social network. The final results of this work are two figures: a diagram showing the change in the studied problems over a specified period of time, and a figure showing the relations between the major papers and groups of problems.
Abstract:
As is known, a large part of all commercially available membranes is prepared by immersion precipitation, which is the primary way to produce flat membranes. The advantages of immersion precipitation are the wide range of polymers that can be used (the polymer must be soluble in a solvent or a solvent mixture) and the ease of performing the process. The literature part of this work deals with phase inversion membrane preparation methods and casting parameters affecting membrane performance. Some membrane types and materials are also discussed. In the experimental part of this work, 73 membrane samples were made with different casting parameters (polymer concentration in the casting solution and precipitation time) and tested for retention and permeability. The results of these experiments are collected and combined into the figures and tables presented in this thesis. This work showed and confirmed the connection between membrane performance and the casting parameters (concentration of polymer in the casting solution and precipitation time).
Abstract:
In the field of observational methodology the observer is obviously a central figure, and close attention should be paid to the process through which he or she acquires, applies, and maintains the skills required. Basic training in how to apply the operational definitions of categories and the rules for coding, coupled with the opportunity to use the observation instrument in real-life situations, can have a positive effect in terms of the degree of agreement achieved when one evaluates intra- and inter-observer reliability. Several authors, including Arias, Argudo, and Alonso (2009) and Medina and Delgado (1999), have put forward proposals for the process of basic and applied training in this context. Reid and DeMaster (1982) focus on the observer's performance and how to maintain the acquired skills, arguing that periodic checks are needed after initial training because an observer may, over time, become less reliable due to the inherent complexity of category systems. The purpose of this subsequent training is to maintain acceptable levels of observer reliability. Various strategies can be used to this end, including providing feedback about those categories associated with a good reliability index, or offering re-training in how to apply those that yield lower indices. The aim of this study is to develop a performance-based index that is capable of assessing an observer's ability to produce reliable observations in conjunction with other observers.
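A standard reliability index of the kind discussed here is Cohen's kappa, which corrects raw inter-observer agreement for agreement expected by chance; the sketch below uses kappa as a common example, not as the performance-based index this study develops.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    # Chance-corrected agreement between two observers' category codes.
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    # Expected agreement under independent coding with each
    # observer's marginal category frequencies.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1.0 - expected)

# Hypothetical codes from two observers for the same six events.
obs1 = ["run", "pass", "run", "shoot", "run", "pass"]
obs2 = ["run", "pass", "run", "pass",  "run", "pass"]
k = cohens_kappa(obs1, obs2)
```

Kappa is 1.0 for perfect agreement and 0.0 when agreement is no better than chance, which is why periodic re-checks against such an index can detect observer drift after initial training.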
Abstract:
The focus of this dissertation is the development of finite elements based on the absolute nodal coordinate formulation. The absolute nodal coordinate formulation is a nonlinear finite element formulation introduced for special requirements in the field of flexible multibody dynamics. In this formulation, a special definition for the rotation of elements is employed to ensure that the formulation does not suffer from singularities due to large rotations. The absolute nodal coordinate formulation can be used for analyzing the dynamics of beam, plate and shell type structures. The improvements to the formulation mainly concern the description of transverse shear deformation. Additionally, the formulation is verified against conventional iso-parametric solid finite elements and geometrically exact beam theory. Previous claims about especially high eigenfrequencies are studied by introducing beam elements based on the absolute nodal coordinate formulation in the framework of the large rotation vector approach. Additionally, the same high-eigenfrequency problem is studied by using constraints for transverse deformation. It was determined that the improvements for shear deformation in the transverse direction lead to clear improvements in computational efficiency. This was especially true when a comparative stress must be defined, for example when using an elasto-plastic material. Furthermore, the developed plate element can be used to avoid certain numerical problems, such as shear and curvature locking. In addition, it was shown that, when compared to conventional solid elements or elements based on nonlinear beam theory, elements based on the absolute nodal coordinate formulation do not lead to an especially stiff system of equations of motion.
Abstract:
The front end of innovation is regarded as one of the most important steps in building new software products or services, and the most significant benefits in software development can be achieved through improvements in the front end activities. Problems in the front end phase have an impact on customer dissatisfaction with delivered software, and on the effectiveness of the entire software development process. When these processes are improved, the likelihood of delivering high quality software and business success increases. This thesis highlights the challenges and problems related to the early phases of software development, and provides new methods and tools for improving performance in the front end activities of software development. The theoretical framework of this study comprises two fields of research. The first section belongs to the field of innovation management, and especially to the management of the early phases of the innovation process, i.e. the front end of innovation. The second section of the framework is closely linked to the processes of software engineering, especially to the early phases of the software development process, i.e. the practice of requirements engineering. Thus, this study extends the theoretical knowledge and discloses the differences and similarities in these two fields of research. In addition, this study opens up a new strand for academic discussion by connecting these research directions. Several qualitative business research methodologies have been utilized in the individual publications to solve the research questions. The theoretical and managerial contribution of the study can be divided into three areas: 1) processes and concepts, 2) challenges and development needs, and 3) means and methods for the front end activities of software development. 
First, the study discloses the differences and similarities between the concepts of the front end of innovation and requirements engineering, and proposes a new framework for managing the front end of the software innovation process, bringing business and innovation perspectives into software development. Furthermore, the study discloses managerial perceptions of the similarities and differences in the concept of the front end of innovation between the software industry and the traditional industrial sector. Second, the study highlights the challenges and development needs in the front end phase of software development, especially challenges in communication, such as linguistic problems, ineffective communication channels, a communication gap between users/customers and software developers, and the participation of multiple persons in software development. Third, the study proposes new group methods for improving the front end activities of software development, especially customer need assessment and the elicitation of software requirements.
Abstract:
A study of the quality of life of the older persons of a municipality is presented, starting from the analysis of the perceptions, evaluations and expectations related to concrete spheres of their lives (family characteristics, housing, health, nearby environment, activities, needs and dependencies, and the persons helping them to satisfy their needs). Answers to 1988 questionnaires obtained from two representative samples of older persons living in private homes in the city have been analysed: persons over 65 years old, and a specific sample composed of a sub-sample of the general one, with persons over 75 years old living alone. Data shaping contextual indicators related to housing have been analysed. The autonomy to drive and the evaluation of one's health seem to be positive indicators to take into account while studying quality of life at these ages. Sentinel indicators of the physical and relational conditions in the family and indicators of dependency are analysed as well. The evaluations of the older persons about their own life conditions are also analysed through psychosocial indicators connected to housing, the nearby environment, their activities and incomes. The results obtained are applicable to improving the decision-making process in social intervention programmes developed in the field of ageing that aim to take the perspectives of older persons into account.
Abstract:
We live in an era defined by a wealth of open and readily available information, and the accelerated evolution of social, mobile and creative technologies. The provision of knowledge, once a primary role of educators, is now devolved to an immense web of free and readily accessible sources. Consequently, educators need to redefine their role not just "from sage on the stage to guide on the side" but, as more and more voices insist, as "designers for learning". The call for such a repositioning of educators is heard from leaders in the field of technology-enhanced learning (TEL) and resonates well with the growing culture of design-based research in Education. However, it is still struggling to find a foothold in educational practice. We contend that the root causes of this discrepancy are the lack of articulation of design practices and methods, along with a shortage of tools and representations to support such practices, a lack of a culture of teacher-as-designer among practitioners, and insufficient theoretical development. The Art and Science of Learning Design (ASLD) explores the frameworks, methods, and tools available for teachers, technologists and researchers interested in designing for learning. Learning Design theories arising from research findings are explored, drawing upon research and practitioner experiences. The book then surveys current trends in the practices, methods, and methodologies of Learning Design. Highlighting the translation of theory into practice, it showcases some of the latest tools that support the learning design process itself.
Abstract:
The formulation of the so-called law of rectilinear diameter for the determination of the critical volume of substances, in the concluding decades of the nineteenth century, became a very useful and acceptably exact alternative tool for researchers in the field of critical phenomena. Its original expression, and even those of its early modifications, were so mathematically simple that their use was not limited to removing the then-existing experimental obstacle to estimating this critical parameter, but also extended over several decades into the increasing applications of the principle of corresponding states.
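The law itself is simple to apply: the mean of the coexisting liquid and vapour densities, (ρ_l + ρ_v)/2, varies nearly linearly with temperature, so fitting a straight line to this "rectilinear diameter" and extrapolating it to the critical temperature yields the critical density, and hence the critical volume. A sketch with made-up coexistence data (all values below are illustrative, not measurements):

```python
import numpy as np

# Hypothetical coexistence data: temperature (K) with liquid and
# vapour densities (g/cm^3). Values are illustrative only.
T     = np.array([280.0, 290.0, 300.0, 310.0, 320.0])
rho_l = np.array([0.720, 0.690, 0.655, 0.615, 0.565])
rho_v = np.array([0.040, 0.055, 0.075, 0.105, 0.150])

diameter = (rho_l + rho_v) / 2.0           # rectilinear diameter
slope, intercept = np.polyfit(T, diameter, 1)

T_c = 340.0                                # critical temperature (assumed known)
rho_c = slope * T_c + intercept            # extrapolated critical density
M = 44.0                                   # molar mass (g/mol), illustrative
V_c = M / rho_c                            # critical molar volume (cm^3/mol)
```

The individual branches ρ_l and ρ_v curve strongly near the critical point, but their mean stays close to a straight line, which is what made the law such a practical route to the otherwise hard-to-measure critical volume.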