999 results for WORK METHODOLOGIES


Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

This paper describes two alternative methodologies for the determination of selected aldehydes (formaldehyde, acetaldehyde, propionaldehyde, acrolein, and benzaldehyde) by capillary electrophoresis (CE). The first approach is based on the formation of aldehyde-bisulfite adducts and employs free-solution CE with reversed electroosmotic flow and indirect detection, using 10 mmol/L 3,5-dinitrobenzoic acid (pH 4.5) containing 0.2 mmol/L cetyltrimethylammonium bromide as the electrolyte. This novel methodology showed fairly good concentration sensitivity, with detection limits for a single aldehyde on the order of 10-40 µg/L, a reasonable analysis time (separation was achieved in <8 min), and no need for sample manipulation. A second approach was proposed in which 2,4-dinitrophenylhydrazine derivatives of the aldehydes were detected in a micellar electrolyte medium (20 mmol/L borate buffer containing 50 mmol/L sodium dodecyl sulfate and 15 mmol/L beta-cyclodextrin). This latter methodology required a laborious sample preconcentration step and showed much poorer sensitivity (0.5-2 mg/L detection limits for a single aldehyde), despite the use of sodium chloride to promote sample stacking. Both methodologies proved adequate for evaluating aldehyde levels in vehicular emissions. Samples from the tailpipe exhaust of a passenger car without a catalytic converter, operated with an ethanol-based fuel, were collected and analyzed; the results showed high levels of formaldehyde and acetaldehyde (0.41-6.1 ppm, v/v). The concentrations estimated by the two methodologies were not in good agreement, suggesting striking differences in sample collection efficiency, which was beyond the scope of this work.
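The exhaust levels above are reported as ppm (v/v) while the detection limits are mass concentrations; a minimal Python sketch of the standard conversion between the two, assuming ideal-gas behaviour at 25 °C and 1 atm (molar volume 24.45 L/mol), is given below. The function name and example values are illustrative.

# Convert a gas-phase mixing ratio (ppm, v/v) to a mass concentration
# (mg/m^3), assuming ideal-gas behaviour at 25 degC and 1 atm, where one
# mole of gas occupies 24.45 L.
MOLAR_VOLUME_L = 24.45  # L/mol at 25 degC, 1 atm

def ppm_to_mg_per_m3(ppm: float, molar_mass_g: float) -> float:
    """Mass concentration (mg/m^3) equivalent to a ppm (v/v) mixing ratio."""
    return ppm * molar_mass_g / MOLAR_VOLUME_L

# Reported exhaust levels: 0.41-6.1 ppm (v/v) for formaldehyde/acetaldehyde.
print(ppm_to_mg_per_m3(0.41, 30.03))  # formaldehyde, lower bound (~0.5 mg/m^3)
print(ppm_to_mg_per_m3(6.1, 44.05))   # acetaldehyde, upper bound (~11 mg/m^3)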

Relevance:

30.00%

Publisher:

Abstract:

According to current international guidelines concerning environmental problems, it is necessary to evaluate and understand indoor radon levels, especially since most of the natural radiation dose to man comes from radon gas and its progeny. Several countries have established national institutions and national programs for the study of radon and its connection with lung cancer risk and public health. The aim of this work is to present the indoor radon measurements and the detection methods used in different regions of Latin America (LA), in countries such as Argentina, Brazil, Ecuador, Mexico, Peru and Venezuela. This study shows that passive radon devices based on alpha-particle nuclear track methodology (NTM) are among the most widely used methods in LA for long-term indoor radon measurements, with CR-39, LR-115 and Makrofol being the most commonly used detector materials. The participating institutions and the radon level measurements in the different countries are presented in this contribution. (C) 2001 Elsevier B.V. Ltd. All rights reserved.
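As an illustration of how the nuclear track methodology mentioned above yields a radon concentration, the sketch below divides a net track density by the exposure time and a detector calibration factor. The calibration factor used here is hypothetical; real values depend on the detector material and come from exposure-chamber calibration.

# Sketch of the readout of a long-term nuclear track detector (e.g. CR-39):
# the mean radon concentration is the net track density divided by the
# exposure time and a detector-specific calibration factor (hypothetical here).
CAL_FACTOR = 2.0e-3  # tracks cm^-2 per (Bq m^-3 h), illustrative only

def radon_concentration(track_density: float, background: float,
                        exposure_hours: float) -> float:
    """Mean radon activity concentration (Bq/m^3) over the exposure period."""
    net_density = track_density - background  # tracks/cm^2
    return net_density / (CAL_FACTOR * exposure_hours)

# Example: 450 tracks/cm^2 (50 background) over a 90-day exposure.
print(radon_concentration(450.0, 50.0, 90 * 24))  # ~92.6 Bq/m^3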

Relevance:

30.00%

Publisher:

Abstract:

Objectives. This paper provides critical perspectives on common in vitro research methodologies, including shear bond testing, wear testing, and load-to-failure tests. The origins of interest in high-quality laboratory data are reviewed, in vitro data are categorized into property and simulation protocols, and two approaches are suggested for establishing clinical validity. It is hoped that these insights will encourage further progress toward the development of in vitro tests that are validated against clinical performance and/or shown to produce clinically validated failure or damage mechanisms.

Materials and methods. Published shear and tensile bond data (macro and micro) are examined in light of published finite element analyses (FEA). These data are subjected to a Weibull scaling analysis to ascertain whether scaling is consistent with failure from the bonded interface. Wear test results are presented in light of the damage mechanism(s) operating. Quantitative wear data are re-examined as being dependent upon contact pressure. Load-to-failure test results are re-analyzed by calculating contact stresses at failure for 119 tests from 54 publications spanning more than 25 years.

Results. FEA analyses and reported failure modes (adhesive, mixed, cohesive) are consistent with failure not involving interfacial "shear stresses" as calculated in published work. Weibull scaling clearly suggests failure originating at the external surfaces of specimens, not at the interface. Contact stresses (pressures) are clearly an important variable in wear testing and are not well controlled in published work. Load-to-failure tests create damage not seen clinically because of excessively high contact stresses. Most contact stresses in the 119 tests examined were calculated to be between 1000 MPa and 5000 MPa, whereas clinical contact stresses at wear facets have been measured not to exceed 40 MPa.

Conclusions. Our community can do a much better job of designing in vitro tests that more closely simulate clinical conditions, especially when contact is involved. Journals are encouraged to thoughtfully consider a ban on publishing papers using bond tests and load-to-failure methods that are seriously flawed and have no clinical relevance. (C) 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
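As a rough illustration of how contact stresses of this magnitude can be recomputed from published load-to-failure data, the sketch below evaluates the peak Hertzian pressure for a spherical indenter on a flat specimen. This is one plausible contact model, and the material constants and load are illustrative, not taken from the 119 tests.

import math

# Peak Hertzian contact pressure for a sphere of radius R pressed on a flat
# with force F: p_max = (6 F E*^2 / (pi^3 R^2))^(1/3), where E* is the
# reduced (effective) modulus of the contacting pair.
def effective_modulus(E1, nu1, E2, nu2):
    """Reduced modulus E* for Hertzian contact (units follow the inputs)."""
    return 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)

def peak_hertz_pressure(force_N, radius_mm, E_star_MPa):
    """Maximum contact pressure (MPa): N, mm and MPa give MPa directly."""
    return (6 * force_N * E_star_MPa**2
            / (math.pi**3 * radius_mm**2)) ** (1.0 / 3.0)

# Illustrative: steel ball (210 GPa, nu=0.30) on a ceramic (70 GPa, nu=0.22),
# 500 N failure load, 3 mm indenter radius.
E_star = effective_modulus(210e3, 0.30, 70e3, 0.22)
print(peak_hertz_pressure(500.0, 3.0, E_star))  # ~3200 MPa, well above 1000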

Relevance:

30.00%

Publisher:

Abstract:

Tactile cartography is an area of cartography that aims at the development of methodologies and didactic materials for working on cartographic concepts with blind and low-vision people. The main aim of this article is to present the experience of the Tactile Cartography Research Group at Sao Paulo State University (UNESP), including didactic materials and courses for teachers using the MAPAVOX system. MAPAVOX is software developed by our research group in partnership with the Federal University of Rio de Janeiro (UFRJ) that integrates maps and models with a voice synthesizer, sound emission, and the display of texts, images and video on computers. Our research methodology is based on authors who place the student at the centre of the didactic activity, such as Ochaita and Espinosa [1], who developed studies related to blind children's literacy. According to Almeida, the child's drawing is thus a system of representation: it is not a copy of objects, but an interpretation of the real, carried out by the child in graphic language [2]. In the proposed activities, blind and low-vision students are prepared to interpret reality and represent it by adopting the concepts of graphic language they have learned. To begin cartographic initiation it is necessary to use personal and everyday references, for example a tactile model or map of the classroom, in order to introduce concepts of generalization and scale related to the students' lived space. Over the years, many case studies were developed with blind and low-vision students from the Special School for the Hearing Impaired and Visually Impaired in Araras and Rio Claro, Sao Paulo, Brazil. Most of these experiences, and others from Brazil and Chile, are presented in [3]. Tactile materials and MAPAVOX features are analysed by students and teachers, who contribute suggestions for reformulating and adapting them to their sensibilities and needs. Since 2005 we have offered courses in tactile cartography to prepare elementary school teachers in the use of the didactic materials and in attending to students with special educational needs in regular classrooms; six classroom and blended courses were offered to 184 teachers from public schools in this region of Sao Paulo state. In conclusion, we observe that methodological procedures centred on blind and low-vision students succeed in supporting their spatial orientation when the didactic materials draw on places or objects with which the students have significant experience. In delivering the teacher courses we also found that interdisciplinary groups arrive at creative cartographic alternatives more easily, and that the best methodological results were those that gave concreteness to abstract concepts through daily experiences.

Relevance:

30.00%

Publisher:

Abstract:

Background. Food handlers have a very important role in preventing food contamination during its preparation and distribution. This responsibility is even greater in hospitals, since a large number of patients have low immunity and food contamination by pathogenic bacteria could consequently be particularly harmful. Therefore, a good working environment and periodic training should be provided to food handlers by upper management.

Methods. This study is qualitative research using focus group and thematic content analysis methodologies to examine in detail the statements of food handlers working in the milk and specific-diet kitchens of a hospital, in order to understand the problems they face in the workplace.

Results. We found that food handlers are aware of the role they play in restoring patients' health and consider it important to offer a good-quality diet. However, according to their perceptions, a number of difficulties prevent them from reaching this aim. These include: upper management not prioritizing human and material resources for the dietetic services when making resource allocation decisions; a perception that upper management considers their work to be of lesser importance; delayed overtime payments; lack of periodic training; managers lacking administrative skills; insufficient dietitian staff assistants, leading to overwork, even as there is an excess of dietitians; unhealthy environmental working conditions (high temperature, high humidity, loud and constant noise, poor ventilation); lack of food, kitchen utensils and equipment; and relationship conflicts with chief dietitians and co-workers.

Conclusion. From these findings, improvement in staff motivation could be achieved by considering non-financial incentives, such as better working conditions and the demonstration of appreciation and respect through supervision, training and performance appraisal. Management action, such as investment in intermediary management so that managers have the capacity to provide supportive supervision, as well as better use of performance appraisal and access to training, may help overcome the identified problems.

Relevance:

30.00%

Publisher:

Abstract:

Visual correspondence is a key computer vision task that aims at identifying projections of the same 3D point in images taken either from different viewpoints or at different time instants. This task has been the subject of intense research activity in recent years, in scenarios such as object recognition, motion detection, stereo vision, pattern matching and image registration. The approaches proposed in the literature typically aim at improving the state of the art by increasing the reliability, accuracy or computational efficiency of visual correspondence algorithms. The research carried out during the Ph.D. course and presented in this dissertation deals with three specific visual correspondence problems: fast pattern matching, stereo correspondence and robust image matching. The dissertation presents original contributions to the theory of visual correspondence, as well as applications dealing with 3D reconstruction and multi-view video surveillance.
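As a concrete illustration of the stereo correspondence problem treated in the dissertation, the sketch below implements one classical baseline: window-based matching by sum of absolute differences (SAD). It is not the dissertation's own algorithm, and real pipelines add cost aggregation, regularization and subpixel refinement.

import numpy as np

# For each pixel of the (rectified) left image, slide a window along the same
# row of the right image and keep the disparity with the lowest SAD cost.
def sad_disparity(left, right, max_disp=16, win=5):
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1].astype(np.float32)
            costs = [np.abs(patch - right[y-half:y+half+1,
                                          x-d-half:x-d+half+1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp

# Usage: pass two rectified grayscale images as 2D uint8 arrays of equal shape.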

Relevance:

30.00%

Publisher:

Abstract:

The ongoing innovation in the microwave transistor technologies used to implement microwave circuits has to be supported by the study and development of proper design methodologies which, depending on the application, fully exploit the technology's potential. Once the technology for a particular application has been chosen, the circuit designer has few degrees of freedom when carrying out the design; in most cases, owing to technological constraints, foundries develop and provide customized processes optimized for a specific performance target such as power, low noise, linearity or bandwidth. For these reasons circuit design is always a compromise, a search for the best trade-off between the desired performances. This approach becomes crucial in the design of microwave systems for satellite applications: tight space constraints impose reaching the best performance under electrical and thermal conditions properly de-rated with respect to the maximum ratings of the chosen technology, in order to ensure adequate levels of reliability. In particular, this work concerns one of the most critical components in the front end of a satellite antenna, the high-power amplifier (HPA). The HPA is the main source of power dissipation and hence the element that weighs most heavily on the space, weight and cost of the telecommunication apparatus; it is clear from the above that design strategies addressing the optimization of power density, efficiency and reliability are of major concern. Many transactions and publications demonstrate different methods for the design of power amplifiers, highlighting the possibility of obtaining very good levels of output power, efficiency and gain. Starting from existing knowledge, the target of the research activities summarized in this dissertation was to develop a design methodology capable of optimizing power amplifier performance while complying with all the constraints imposed by space applications, taking the thermal behaviour into account in the same manner as power and efficiency. After a review of existing theories of power amplifier design, the first section of this work describes the effectiveness of a methodology based on the accurate control and shaping of the dynamic load line, explaining all the steps in the design of two different kinds of high-power amplifier. Taking the trade-off between the main performances and reliability as the target of the design activity, we demonstrate that the expected results can be obtained by working on the characteristics of the load line at the intrinsic terminals of the selected active device. The methodology proposed in this first part assumes that the designer has an accurate electrical model of the device available; the variety of publications on this subject shows how difficult it is to build a CAD model capable of taking into account all the non-ideal phenomena that occur when the amplifier operates at such high frequency and power levels. For this reason, and especially for the emerging gallium nitride (GaN) technology, the second section describes a new approach to power amplifier design based on the experimental characterization of the intrinsic load line by means of a low-frequency, high-power measurement bench. Thanks to the possibility of carrying out my Ph.D. in an academic spin-off, MEC (Microwave Electronics for Communications), the results of this activity have been applied to important research programs requested by space agencies, with the aim of supporting technology transfer from universities to industry and promoting science-based entrepreneurship. For these reasons the proposed design methodology is explained on the basis of extensive experimental results.
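For readers unfamiliar with the load-line concept at the heart of the methodology, the sketch below evaluates the textbook class-A estimate of the optimum intrinsic load resistance and maximum linear output power from the supply voltage, knee voltage and maximum drain current. The device values are illustrative (loosely GaN-like) and are not taken from the dissertation.

# Ideal class-A load line at the intrinsic device terminals: the voltage
# swings between Vknee and 2*Vdd - Vknee while the current swings between 0
# and Imax, which fixes both the optimum load and the maximum linear power.
def loadline_estimates(vdd: float, vknee: float, imax: float):
    """Return (R_opt in ohms, P_max in W) for an ideal class-A load line."""
    r_opt = 2.0 * (vdd - vknee) / imax   # slope of the full-swing load line
    p_max = (vdd - vknee) * imax / 4.0   # fundamental output power
    return r_opt, p_max

r_opt, p_max = loadline_estimates(vdd=28.0, vknee=4.0, imax=1.2)
print(f"R_opt = {r_opt:.1f} ohm, P_max = {p_max:.1f} W")  # 40.0 ohm, 7.2 W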

Relevance:

30.00%

Publisher:

Abstract:

This Ph.D. work focused mainly on catalysis as a key technology for achieving the objectives of sustainable (green) chemistry. After introducing the concepts of sustainable (green) chemistry and assessing new sustainable chemical technologies, the relationship between catalysis and sustainable (green) chemistry is briefly discussed and illustrated through an analysis of selected relevant examples. Afterwards, continuing the ongoing interest of Dr. Marco Bandini's group in organometallic and organocatalytic processes, I directed my efforts to the design and development of novel catalytic green methodologies for the synthesis of enantiomerically enriched molecules. In the first two projects, attention was focused on the use of solid supports to carry out reactions that still remain the prerogative of homogeneous catalysis. First, particular emphasis was placed on the discovery of catalytic enantioselective variants of the nitroaldol condensation (commonly termed the Henry reaction), using a complex consisting of polyethylene-supported diamino thiophene (DATx) ligands and copper as the active species. In the second project, a new class of surfaces electrochemically modified with DATx palladium complexes is presented; the DATx-graphite system proved efficient in promoting the Suzuki reaction. Moreover, in collaboration with Prof. Wolf at the University of British Columbia (Vancouver), cyclic voltammetry studies were reported. This study disclosed new opportunities for carbon-carbon bond-forming processes using heterogeneous, electrodeposited catalyst films. Straightforward metal-free catalysis then allowed an exploration of the world of organocatalysis: three novel methodologies, using Cinchona, guanidine and phosphine derivatives, were envisioned in the three following projects. An interesting variant of the nitroaldol condensation with simple trifluoromethyl ketones, together with their application in a non-conventional activation of indolyl cores by Friedel-Crafts functionalization, led to two novel synthetic protocols. These approaches allowed the preparation of synthetically useful trifluoromethyl derivatives bearing quaternary stereocenters. Lastly, in the sixth project, the first γ-alkylation of allenoates with conjugated carbonyl compounds was envisioned. The last part of this Ph.D. thesis, based on an extraordinary collaboration with Prof. Balzani and Prof. Gigli, involved the synthesis and characterization of a new type of heteroleptic cyclometalated Ir(III) complexes bearing bis-oxazolines (BOXs) as ancillary ligands. The new heteroleptic complexes were fully characterized and, in order to examine the electroluminescent properties of FIrBOX(CH2), an organic light-emitting device was realized.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation reports on a collaborative project between the Computer Science and Humanities Departments to develop case studies that focus on issues of communication in the workplace, and on the results of their use in the classroom. My argument is that case study teaching simulates real-world experience in a meaningful way, essentially offering a teachable means of developing phronesis, the reasoned capacity to act for the good in public. In addition, it can be read as a "how-to" guide for educators who may wish to construct their own case studies. To that end, I have included a discussion of the ethnographic methodologies employed and how they were adapted to our more pragmatic ends. Finally, I present my overarching argument for a new appraisal of the concept of techné. This reappraisal emphasizes its productive activity, poiesis, rather than focusing on its knowledge, as has been the case in the past. I propose that focusing on the telos, the end outside the production, contributes to the diminishment, if not the complete foreclosure, of a rich concept of techné.

Relevance:

30.00%

Publisher:

Abstract:

Research and professional practices share the aim of restructuring preconceived notions of reality; both seek an understanding of social reality. Social workers use their professional competence to grasp the reality of their clients, while researchers strive to unlock the secrets of their research material. Development and research are now so intertwined and inherent in almost all professional practices that distinguishing between practising, developing and researching has become difficult and in many respects irrelevant. Moving towards research-based practices is possible, and it is easily accomplished within the framework of the qualitative research approach (Dominelli 2005, 235; Humphries 2005, 280). Social work can be understood as acts and speech acts crisscrossing between social workers and clients. When trying to catch the verbal and non-verbal hints in each other's behaviour, the actors have to make many interpretations in a more or less uncertain mental landscape. Our point of departure is the idea that the study of social work practices requires tools which effectively reveal the internal complexity of social work (see, for example, Adams & Dominelli & Payne 2005, 294-295). The boom in qualitative research methodologies in recent decades is associated with a profound rupture in the humanities known as the linguistic turn (Rorty 1967). The idea that language does not transparently mediate our perceptions and thoughts about reality but instead constitutes it was new and even confusing to many social scientists. Nowadays we are used to reading research reports that apply various branches of discourse analysis or narratological or semiotic approaches. Although the differences between these orientations are subtle, they share the idea of the predominance of language. Despite the lively research work in today's social work and the research-minded atmosphere of social work practice, semiotics has rarely been applied in social work research. Yet social work, as a communicative practice, concerns symbols, metaphors and all kinds of representative structures of language. Those items are at the core of semiotics, the science of signs, which examines how people use signs in their mutual interaction and in their endeavours to make sense of the world they live in, their semiosis. When thinking about the practice of social work and research into it, a number of interpretational levels have to be passed before the research phase is reached. First of all, social workers have to interpret their clients' situations, which are recorded in case files. In some very rare cases those past situations are later reflected upon in discussions or interviews, or put under the scrutiny of a researcher. Each new observation adds its own flavour to the mixture of meanings. Social workers combine their observations with previous experience and professional knowledge, and the situation at hand also influences their reactions. Moreover, the interpretations social workers make in the course of their daily routines are never merely part of a personal process; they are always inherently cultural.

Work aiming at social change is defined by the presence of an initial situation, a specific goal, and the means and ways of achieving it, which are, or should be, agreed upon by the social worker and the client in a situation that is at once unique and socially driven. Because of the inherently plot-based nature of social work, the practices related to it can be analysed as stories (see Dominelli 2005, 234), given, of course, that they signify and are told by someone. Research into these practices concentrates on impressions, perceptions, judgements, accounts, documents and so on. All these multifarious elements can be scrutinized as textual corpora, though not as just any textual material: in semiotic analysis, the material studied is characterised as verbal or textual and as loaded with meanings. We present a contribution to research methodology, semiotic analysis, which to our mind has at least implicit relevance to social work practices. Our examples of semiotic interpretation are drawn from our dissertations (Laine 2005; Saurama 2002). The data are official documents from the archives of a child welfare agency and transcriptions of interviews with shelter employees. These data can be defined as stories told by social workers about what they have seen and felt. The official documents present only fragments and are often written in the passive voice (Saurama 2002, 70). The interviews carried out in the shelters can be described as stories whose narrators are more familiar and known; this material is characterised by the interaction between interviewer and interviewee. The levels of the story and of the telling of the story become apparent when interviews or documents are examined with semiotic tools. The roots of semiotic interpretation lie in three different branches: American pragmatism, Saussurean linguistics in Paris, and the so-called formalism of Moscow and Tartu. In this paper, however, we engage with the so-called Parisian school of semiology, whose prominent figure was A. J. Greimas. The Finnish sociologists Pekka Sulkunen and Jukka Törrönen (1997a; 1997b) have further developed Greimas's ideas in their studies on socio-semiotics, and we lean on their work. In semiotics, social reality is conceived as a relationship between subjects, observations and interpretations, mediated by natural language, the most common sign system among human beings (Mounin 1985; de Saussure 2006; Sebeok 1986). Signification is the act of associating an abstract content (the signified) with some physical instrument (the signifier). These two elements together form the basic concept, the sign, which never constitutes meaning alone: meaning arises in a process of distinction in which signs are related to other signs, and in this chain of signs meaning diverges from reality (Greimas 1980, 28; Potter 1996, 70; de Saussure 2006, 46-48). One interpretative tool is to think of speech as a surface beneath which deep structures, that is values and norms, exist (Greimas & Courtes 1982; Greimas 1987). To our mind, semiotics is very much about playing with two different levels of text: the syntagmatic surface, which is more or less faithful to the grammar, and the paradigmatic, semantic structure of values and norms hidden in the deeper meanings of interpretations.

Semiotic analysis deals precisely with the level of meaning that exists beneath the surface, but the only way to reach those meanings is through the textual level, the written or spoken text; that is why tools are needed. In our studies we have used the semiotic square and actant analysis: the former is based on the distinction and categorisation of meanings, the latter on opening up the plotting of narratives in order to reach the underlying value structures.
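For readers unfamiliar with the semiotic square, the sketch below encodes its formal skeleton, two contrary terms, their contradictories and the standard relations between them, as a small Python data structure. The example terms are hypothetical and carry none of the case material analysed in the dissertations.

from dataclasses import dataclass

# Greimas's semiotic square: contrary terms s1/s2, their contradictories
# not-s1/not-s2, and the relations of contrariety, contradiction and
# complementarity that organise the categorisation of meanings.
@dataclass
class SemioticSquare:
    s1: str  # e.g. "care"
    s2: str  # e.g. "neglect" (contrary of s1)

    def relations(self):
        not_s1, not_s2 = f"not-{self.s1}", f"not-{self.s2}"
        return {
            "contrariety": (self.s1, self.s2),
            "contradiction": [(self.s1, not_s1), (self.s2, not_s2)],
            "complementarity": [(not_s2, self.s1), (not_s1, self.s2)],
        }

print(SemioticSquare("care", "neglect").relations())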

Relevance:

30.00%

Publisher:

Abstract:

In the early 1990s, ontology development was akin to an art: ontology developers had no clear guidelines on how to build ontologies, only some design criteria to follow. Work on principles, methods and methodologies, together with supporting technologies and languages, turned ontology development into an engineering discipline, the so-called Ontology Engineering. Ontology Engineering refers to the set of activities that concern the ontology development process and the ontology life cycle, the methods and methodologies for building ontologies, and the tool suites and languages that support them. Thanks to the work done in the Ontology Engineering field, the development of ontologies within and between teams has increased and improved, as has the possibility of reusing ontologies in other developments and in final applications. Currently, ontologies are widely used in (a) Knowledge Engineering, Artificial Intelligence and Computer Science; (b) applications related to knowledge management, natural language processing, e-commerce, intelligent information integration, information retrieval, database design and integration, bio-informatics and education; and (c) the Semantic Web, the Semantic Grid, and the Linked Data initiative. In this paper we provide an overview of Ontology Engineering, covering the most prominent and widely used methodologies, languages and tools for building ontologies. In addition, we comment on how all these elements can be used in the Linked Data initiative.
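As a minimal illustration of the kind of artefact these methodologies, languages and tools produce, the sketch below builds a two-class ontology fragment and serializes it as Turtle using the rdflib library; the example.org namespace and the class names are hypothetical.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/onto#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Two OWL classes and an object property linking them.
g.add((EX.Researcher, RDF.type, OWL.Class))
g.add((EX.Methodology, RDF.type, OWL.Class))
g.add((EX.uses, RDF.type, OWL.ObjectProperty))
g.add((EX.uses, RDFS.domain, EX.Researcher))
g.add((EX.uses, RDFS.range, EX.Methodology))
g.add((EX.Methodology, RDFS.label, Literal("Methodology", lang="en")))

print(g.serialize(format="turtle"))  # RDF ready for Linked Data publication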

Relevance:

30.00%

Publisher:

Abstract:

“Teamwork” is one of the abilities most valued by employers. In [16] we describe the process of adapting a computer programming course for students in a technical degree not specifically dedicated to computing (Marine Engineering, UPM) to the ECTS methodologies for continuous assessment. As a further step in this process we have emphasized cooperative learning: the students were paired, and the work of each pair was evaluated via surprise tests taken and graded jointly, which constituted a substantial part of the final grade. Here we document this experience, discussing methodological aspects, describing indicators for measuring the impact of these methodologies on the educational experience, and reporting the students’ opinion of it.

Relevance:

30.00%

Publisher:

Abstract:

A fully 3D iterative image reconstruction algorithm, based on the ordered-subsets approach, has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range and the interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail; the remaining system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse-matrix techniques are employed for the non-zero system matrix elements, allowing fast execution of the image reconstruction process. This 3D reconstruction scheme has been compared, in terms of image quality, with a fast 2D implementation of the OSEM algorithm combined with Fourier rebinning. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction compared with conventional 2D approaches based on rebinning schemes. At the same time, it demonstrates that fully 3D methodologies can be applied efficiently to the image reconstruction problem for high-resolution rotational PET cameras by using accurate precalculated system models and taking advantage of the system's symmetries.
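A minimal sketch of the ordered-subsets (OSEM) update underlying such a reconstruction scheme is given below; for readability the system matrix A is a dense NumPy array, whereas the work described above stores only its non-zero elements and expands them through the scanner symmetries.

import numpy as np

# One OSEM iteration: for each subset of lines of response (LORs), scale the
# current image by the ratio of measured to estimated projections, normalised
# by the subset sensitivity image.
def osem_iteration(x, y, A, n_subsets):
    n_lors = A.shape[0]
    for s in range(n_subsets):
        rows = np.arange(s, n_lors, n_subsets)    # interleaved LOR subset
        As = A[rows]
        sens = As.sum(axis=0)                     # subset sensitivity image
        fwd = As @ x                              # forward projection
        ratio = y[rows] / np.maximum(fwd, 1e-12)  # measured / estimated
        x = x / np.maximum(sens, 1e-12) * (As.T @ ratio)
    return x

# Toy problem: 4 voxels, 8 LORs, 2 subsets, noiseless data.
rng = np.random.default_rng(0)
A = rng.random((8, 4))
x_true = np.array([1.0, 2.0, 0.5, 3.0])
y = A @ x_true
x = np.ones(4)
for _ in range(50):
    x = osem_iteration(x, y, A, n_subsets=2)
print(x)  # converges toward x_true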

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the results of research aimed at formulating a general model to support the implementation and management of an urban road pricing scheme. After preliminary work to establish the state of the art in sustainable urban mobility strategies, the problem was set up theoretically in terms of transport economics, introducing the concept of external costs, duly translated into the principle of pricing for the use of public infrastructure. The research is based on the definition of a set of direct and indirect indicators that characterize urban areas by land use, mobility, environmental and economic conditions. These indicators were calculated for a selected set of typical urban areas in Europe on the basis of a survey carried out by means of a specific questionnaire. Once the most typical and interesting applications of the road pricing concept had been identified, in cities such as London (Congestion Charging), Milan (Ecopass), Stockholm (Congestion Tax) and Rome (ZTL), a large benchmarking exercise and a cross-analysis of the direct and indirect indicators made it possible to define a simple general model, guidelines and key requirements for implementing a pricing-based traffic restriction scheme in a generic urban area. The model was finally applied to the design of a road pricing scheme for a particular area of Madrid, and to the quantification of the expected results of its implementation from a land use, mobility, environmental and economic perspective.
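As a sketch of the indicator step described above, the code below normalises each indicator across the surveyed areas and combines them into a single weighted score per area; the indicator names, weights and values are hypothetical and do not reproduce the study's model.

# Min-max normalise each indicator across the surveyed areas, then combine
# the normalised values with weights into one score per area.
def suitability_scores(areas: dict, weights: dict) -> dict:
    names = list(weights)
    lows = {n: min(a[n] for a in areas.values()) for n in names}
    highs = {n: max(a[n] for a in areas.values()) for n in names}
    scores = {}
    for city, vals in areas.items():
        norm = {n: (vals[n] - lows[n]) / (highs[n] - lows[n] or 1.0)
                for n in names}
        scores[city] = sum(weights[n] * norm[n] for n in names)
    return scores

# Hypothetical indicators for two areas (density in inhabitants/ha,
# congestion and transit share as fractions) and illustrative weights.
areas = {
    "A": {"density": 120, "congestion": 0.8, "transit_share": 0.45},
    "B": {"density": 60,  "congestion": 0.5, "transit_share": 0.30},
}
weights = {"density": 0.3, "congestion": 0.5, "transit_share": 0.2}
print(suitability_scores(areas, weights))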