928 results for techniques of acting


Relevance:

80.00%

Publisher:

Abstract:

In this work, the mesoporous materials SBA-15 and Al-SBA-15 (Si/Al = 25, 50 and 75) were synthesized and characterized. These materials, discovered by researchers at the University of California, Santa Barbara, USA, have pore diameters ranging from 2 to 30 nm and wall thicknesses from 3.1 to 6.4 nm, which makes them promising in the field of catalysis, particularly for petroleum refining (catalytic cracking), since their mesopores facilitate the access of the molecules that constitute the oil to the active sites, thereby increasing the production of hydrocarbons in the light and medium ranges. To verify that the materials used as catalysts were successfully synthesized, they were characterized by X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FT-IR) and nitrogen adsorption (BET). To assess their catalytic activity, a sample of atmospheric residue (ATR) from the Guamaré-RN pole was submitted to thermal and catalytic degradation monitored by thermogravimetry. From the thermogravimetric curves, a reduction in the onset temperature of the decomposition process of the catalytic ATR was observed. The kinetic model proposed by Flynn and Wall yielded the parameters needed to determine the apparent activation energy of decomposition and demonstrated the efficiency of the mesoporous materials, since the activation energy decreased for the reactions using catalysts. The ATR was also subjected to pyrolysis in a pyrolyzer coupled to a gas chromatograph with a mass spectrometer. The chromatograms showed an increase in the yield of compounds in the gasoline and diesel ranges for the catalytic pyrolysis, with emphasis on Al-SBA-15 (Si/Al = 25), which showed a higher percentage than the other catalysts. These results are due to the fact that the synthesized materials exhibit properties suited to the pyrolysis of complex, high-molecular-weight molecules such as those that constitute the ATR.
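
For reference, a minimal sketch of the isoconversional expression commonly used in the Flynn-Wall (Flynn-Wall-Ozawa) method, written with Doyle's approximation; the abstract does not state the exact form adopted, so this is the textbook relation rather than the authors' own derivation:

```latex
\log \beta \;\approx\; \log\!\left(\frac{A\,E_a}{g(\alpha)\,R}\right) \;-\; 2.315 \;-\; 0.457\,\frac{E_a}{R\,T}
```

At a fixed conversion, plotting log β against 1/T for several heating rates β gives an approximately straight line whose slope is about -0.457 Ea/R, from which the apparent activation energy Ea is estimated.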

Relevance:

80.00%

Publisher:

Abstract:

There is a lack of clinical studies evaluating functional impression techniques for partially edentulous arches. The aim of this double-blind, non-randomized controlled clinical trial was to compare the efficacy of the altered cast impression (ACI) and direct functional impression (DFI) techniques. Efficacy was evaluated with respect to the number of occlusal units on denture teeth, mucosa integrity at the 24-hour follow-up and denture base extension. The sample included 51 patients (female and male) with a mean age of 58.96 years treated at the Dental Department of UFRN. The patients, who presented an edentulous maxilla and a mandibular Kennedy class I arch, were divided into two groups (group ACI, n = 29; group DFI, n = 22). Clinical evaluation was based on the number of occlusal units on natural and/or artificial teeth, mucosa integrity at the 24-hour follow-up, and denture base extension. Statistical analysis was conducted using the software SPSS 17.0® (SPSS Inc., Chicago, Illinois). Student's t-test was used to assess the association between the number of occlusal units and the impression technique, while the chi-square test was used for the association between mucosa integrity and impression technique. Fisher's exact test was applied to the association between denture base extension and impression technique, at a 95% level of significance. No significant difference was observed between the groups regarding the number of occlusal units, mucosa integrity or denture base extension. The altered cast technique did not provide significant improvement over the direct technique when the number of occlusal units, mucosa integrity and denture base extension were considered.
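
As an illustration only, a minimal Python sketch of the three tests named above, using made-up counts and measurements (the trial's actual data are not reproduced here):

```python
# Hedged sketch: hypothetical data standing in for the trial's measurements.
from scipy import stats

# Student's t-test: number of occlusal units per patient in each group (made-up values).
occlusal_aci = [10, 12, 11, 9, 12, 10, 11]
occlusal_dfi = [11, 10, 12, 10, 11, 9, 10]
t_stat, t_p = stats.ttest_ind(occlusal_aci, occlusal_dfi)

# Chi-square test: mucosa integrity (intact vs. injured) by technique (made-up 2x2 table).
mucosa_table = [[25, 4],   # ACI: intact, injured
                [18, 4]]   # DFI: intact, injured
chi2, chi_p, dof, expected = stats.chi2_contingency(mucosa_table)

# Fisher's exact test: denture base extension (adequate vs. inadequate) by technique.
extension_table = [[26, 3],
                   [19, 3]]
odds_ratio, fisher_p = stats.fisher_exact(extension_table)

alpha = 0.05  # 95% significance level
print(f"t-test p={t_p:.3f}, chi-square p={chi_p:.3f}, Fisher p={fisher_p:.3f}")
```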

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to study the effect of selective thinning on the genetic divergence among progenies of Pinus caribaea var. bahamensis, aiming to identify the most productive and divergent progenies for use in an improvement program. The progeny test, containing 119 progenies and two commercial controls, was planted in March 1990 in an 11 x 11 sextuple, partially balanced square lattice design, arranged in linear plots with six trees at a spacing of 3.0 x 3.0 m. Thirteen years after planting, thinning was carried out (selection for DBH) with a 50% selection intensity based on the multi-effect index, leaving three trees per plot throughout the experiment. Evaluations were made in four situations: A (before thinning); B (thinned trees); C (remaining trees after thinning); and D (one year after thinning). The analyzed traits were: height, diameter at breast height (DBH), volume, stem form and wood density. The genetic divergence among the progenies was studied with the aid of canonical variables and of Tocher's clustering method, using the generalized Mahalanobis distance matrix (D²) as the estimate of genetic dissimilarity. The progenies were grouped into four groups in situation A, fourteen in situation B, two in situation C and three in situation D. The selective thinning of trees within the progenies changed the genetic divergence among the progenies, genetically homogenizing them, as demonstrated by the generalized Mahalanobis distances, Tocher's clustering and the canonical variables methods. The thinning also produced a high uniformity in the relative contribution of the traits to the total genetic divergence. The clustering techniques were efficient in identifying groups of divergent progenies for use in hybridization and of little-divergent progenies for use in a backcross program.
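
A minimal sketch, assuming hypothetical trait means and a hypothetical pooled covariance matrix (the study's actual data are not shown), of how the generalized Mahalanobis distance D² between progenies can be computed from multi-trait means:

```python
# Hedged sketch: pairwise Mahalanobis D^2 between progeny trait means.
# Trait vectors (height, DBH, volume, stem form, wood density) are invented.
import numpy as np

progeny_means = {
    "P001": np.array([14.2, 18.5, 0.21, 2.1, 0.46]),
    "P002": np.array([13.8, 17.9, 0.19, 2.3, 0.44]),
    "P003": np.array([15.0, 19.4, 0.24, 2.0, 0.47]),
}

# Pooled within-progeny covariance matrix, also hypothetical.
pooled_cov = np.diag([0.8, 1.5, 0.002, 0.1, 0.0004])
cov_inv = np.linalg.inv(pooled_cov)

def mahalanobis_d2(x, y):
    """Generalized Mahalanobis distance D^2 = (x - y)' S^-1 (x - y)."""
    diff = x - y
    return float(diff @ cov_inv @ diff)

names = list(progeny_means)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(a, b, round(mahalanobis_d2(progeny_means[a], progeny_means[b]), 3))
```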

Relevance:

80.00%

Publisher:

Abstract:

This work was developed within the Graduate Program in Social Service of the Federal University of Rio Grande do Norte. It deals with the process of inclusion of people with disabilities in the job market in Mossoró-RN, bringing to the academic debate a theme that is relevant to Brazilian society, to the profession of Social Service and related areas, and to people with disabilities themselves. Its objective is to apprehend the determinants that make the inclusion of people with disabilities in the job market of Mossoró possible, taking as a parameter the National Policy for the Integration of Persons with Disabilities. The critical theoretical perspective is grounded in Marx's ideas for the understanding of work, in Pochmann for the job market, in Martins, Yazbek and Sposati for the exclusion/inclusion category, and in the National Policy for the Integration of Persons with Disabilities regarding disability. The research is of a qualitative nature and took as subjects 26 (twenty-six) people: 09 (nine) people with disabilities employed in the formal, regulated job market, and 17 (seventeen) managers of private companies and public institutions in the city of Mossoró-RN. For data collection we used non-systematic observation, semi-structured interviews and documentary analysis. The results of the research indicate that any modality of human workforce used in the current context is functional to capitalism and moves toward the exploitation, alienation and subordination of work to capital; that the National Policy for the Integration of Persons with Disabilities expresses and reproduces the contradictory dynamics of class society, reflecting neoliberal traits through selectivity and through the articulation among federated entities and civil society organizations in its operational system; that there is a disproportion between the quota percentages and the number of people with disabilities actually employed, which corresponds to a tiny numeric magnitude; and that the activities performed are of low social status, with an expressive number of workers earning between one and two minimum wages. These data lead us to infer that the aforementioned policy makes possible, in part, the inclusion of people with disabilities in the job market; however, such inclusion takes place in a selective or focused, marginal, precarious and unstable way.

Relevance:

80.00%

Publisher:

Abstract:

Smart card applications represent a growing market. This kind of application usually manipulates and stores critical information that requires some level of security, such as financial or confidential data. The quality and trustworthiness of smart card software can be improved through a rigorous development process that embraces formal software engineering techniques. In this work we propose the BSmart method, a specialization of the B formal method dedicated to the development of Java Card smart card applications. The method describes how a Java Card application can be generated from the B refinement process of its formal abstract specification. The development is supported by a set of tools that automates the generation of some of the required refinements and the translation to the Java Card client (host) and server (applet) applications. With respect to verification, the method's development process was formalized and verified in the B method, using the Atelier B tool [Cle12a]. We emphasize that the Java Card application is translated from the last refinement stage, named implementation. This translation process was specified in ASF+SDF [BKV08], describing the grammar of both languages (SDF) and the code transformations through rewrite rules (ASF). This specification was an important support during the development of the translator and contributes to the tool's documentation. We also emphasize the KitSmart library [Dut06, San12], an essential component of BSmart, containing models of all 93 classes/interfaces of the Java Card API 2.2.2, of Java/Java Card data types, and machines that can be useful to the specifier but are not part of the standard Java Card library. In order to validate the method, its tool support and KitSmart, we developed an electronic passport application following the BSmart method. We believe that the results reached in this work contribute to Java Card development, allowing the generation of complete (client and server components) Java Card applications that are less subject to errors.

Relevance:

80.00%

Publisher:

Abstract:

Although some individual supervised Machine Learning (ML) techniques, also known as classifiers or classification algorithms, supply solutions that are considered efficient most of the time, experimental results obtained with large pattern sets, or with sets containing an expressive amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. In order to obtain better performance and efficiency from these ML techniques, the idea arose of using several ML algorithms jointly, giving origin to the term Multi-Classifier System (MCS). An MCS has as components different ML algorithms, called base classifiers, and combines the results obtained by these algorithms to reach the final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must present a certain diversity, in other words, a difference between the results obtained by each classifier that composes the system. It can be said that there is no point in having an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is always the search to improve the results obtained by this type of system. Aiming at this improvement, at a better consistency of the results and at a larger diversity among the classifiers of an MCS, methodologies characterized by the use of weights, or confidence values, have recently been investigated. These weights can describe the importance that a certain classifier had when associating each pattern to a determined class. The weights are also used, together with the outputs of the classifiers, during the recognition (use) phase of the MCS. There are different ways of calculating these weights, which can be divided into two categories: static weights and dynamic weights. The first category is characterized by weights whose values do not change during the classification process; the opposite occurs with the second category, whose values are modified during the classification process. In this work, an analysis is made to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs in comparison with individual systems. Moreover, an analysis of the diversity obtained by the MCSs is made, in order to verify whether there is some relation between the use of weights in MCSs and different levels of diversity.
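
A minimal sketch of one possible static-weight scheme, assuming weights taken from validation accuracy and a toy dataset; the base classifiers, data and weighting rule are illustrative, not the dissertation's actual configuration:

```python
# Hedged sketch: a multi-classifier system combining base classifiers
# with static weights derived from validation accuracy.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
X_fit, X_val, y_fit, y_val = train_test_split(X_train, y_train, random_state=42)

base_classifiers = [DecisionTreeClassifier(random_state=0),
                    GaussianNB(),
                    KNeighborsClassifier(n_neighbors=5)]

weights = []
for clf in base_classifiers:
    clf.fit(X_fit, y_fit)
    weights.append(clf.score(X_val, y_val))   # static weight = validation accuracy

# Weighted combination of class-probability outputs (one weight per classifier).
proba = sum(w * clf.predict_proba(X_test) for w, clf in zip(weights, base_classifiers))
y_pred = proba.argmax(axis=1)
print("ensemble accuracy:", (y_pred == y_test).mean())
```

A dynamic-weight variant would recompute the weights per test pattern (for example, from the classifiers' local accuracy around that pattern) instead of fixing them once on the validation set.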

Relevance:

80.00%

Publisher:

Abstract:

With the increasing complexity of software systems, there is also increased concern about their faults. These faults can cause financial losses and even loss of life. Therefore, we propose in this work the minimization of faults in software by using formally specified tests. The combination of testing and formal specifications has been gaining strength, mainly through MBT (Model-Based Testing). The development of software from formal specifications, when the whole refinement process is done rigorously, ensures that what is specified will be implemented in the application. Thus, the implementation generated from these specifications would accurately reflect what was specified. However, the specification is not always refined down to the level of implementation and code generation, and in these cases the tests generated from the specification tend to find faults. Additionally, the generation of so-called "invalid tests", i.e. tests that exercise application scenarios that were not addressed in the specification, complements the formal development process even more significantly. Therefore, this work proposes a method for generating tests from B formal specifications. The method was structured in pseudo-code and is based on the systematization of the black-box testing techniques of boundary value analysis and equivalence partitioning, as well as the orthogonal pairs technique. The method was applied to a B specification, and B test machines that generate test cases independent of the implementation language were produced. In order to validate the method, the test cases were manually transformed into JUnit test cases, and the application, created from the B specification and developed in Java, was tested. Faults were found with the execution of the JUnit test cases.
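
For illustration only, a small sketch of the classic black-box heuristics the method systematizes, applied to a hypothetical integer input with valid range [1, 100]; the parameter names and values are invented, and the pair generation shown is a plain Cartesian product rather than a true pairwise algorithm:

```python
# Hedged sketch: boundary value analysis, equivalence partitioning and
# pair combination for invented input parameters.
from itertools import product

LOW, HIGH = 1, 100   # hypothetical valid range taken from a specification

# Boundary value analysis: values at and around each boundary.
boundary_values = [LOW - 1, LOW, LOW + 1, HIGH - 1, HIGH, HIGH + 1]

# Equivalence partitioning: one representative per partition
# (below range, inside range, above range).
equivalence_values = [LOW - 10, (LOW + HIGH) // 2, HIGH + 10]

# Combination of two other hypothetical parameters: the exhaustive product is
# shown for brevity; a real orthogonal-pairs technique would cover every pair
# of values with far fewer cases when there are three or more parameters.
profiles = ["guest", "admin"]
modes = ["read", "write", "delete"]
pair_cases = list(product(profiles, modes))

print("boundary:", boundary_values)
print("partitions:", equivalence_values)
print("pairs:", pair_cases)
```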

Relevance:

80.00%

Publisher:

Abstract:

This dissertation presents a model-driven and integrated approach to the variability management, customization and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering provides support for the specification of software processes and for their transformation into workflow specifications. Software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. In order to evaluate the feasibility of the approach, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow language specifications, in order to enable the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
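
As a loose illustration only (the dissertation's tooling is built on EPF, ATL and Acceleo, none of which is reproduced here), a toy sketch of the product-derivation idea behind variability management: resolving the variation points of a process model against a feature selection. All fragment and feature names are invented:

```python
# Hedged toy sketch of variability resolution in a software process model.
process_model = [
    {"activity": "Plan Iteration",                "variation_point": None},
    {"activity": "Estimate with Planning Poker",  "variation_point": "agile_estimation"},
    {"activity": "Formal Risk Review",            "variation_point": "formal_risk_management"},
    {"activity": "Assess Results",                "variation_point": None},
]

def derive_process(model, selected_features):
    """Keep mandatory fragments and the optional ones whose feature is selected."""
    return [frag["activity"] for frag in model
            if frag["variation_point"] is None
            or frag["variation_point"] in selected_features]

print(derive_process(process_model, {"agile_estimation"}))
# ['Plan Iteration', 'Estimate with Planning Poker', 'Assess Results']
```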

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES: There is controversy as to whether labor analgesia can interfere with the progress of labor and with the vitality of the newborn. The objective of this study was to evaluate the interaction between labor analgesia performed with the continuous epidural and combined spinal-epidural (double blockade) techniques, with a small dose of local anesthetic, and the type of delivery, through the analysis of the newborn's weight and Apgar score. METHODS: The results of 168 labor analgesias (January 2002 to January 2003) were analyzed prospectively and divided into four groups: G1 (n = 58), continuous epidural with evolution to vaginal delivery; G2 (n = 69), double blockade with evolution to vaginal delivery; G3 (n = 25), continuous epidural with evolution to cesarean section; G4 (n = 16), double blockade with evolution to cesarean section. G1 received 0.125% ropivacaine (12 to 15 mL); G2 received 0.5% bupivacaine (0.5 to 1 mL) and sufentanil (10 mg) by the subarachnoid route. Epidural 0.5% ropivacaine was administered for vaginal delivery (8 mL) and for cesarean section (20 mL). The parameters evaluated were age, weight, height, body mass index (BMI), gestational age (GA), parity and complications (arterial hypotension, bradycardia and hypoxia), and, for the newborn, weight and Apgar score (1st, 5th and 10th minutes). RESULTS: Most parturients were primiparous, with term pregnancies (one GA of 28 weeks and no post-term pregnancy); for maternal weight, G2 < G4, and, for BMI, G2 ≤ G4. For newborn weight, G1 < G3 and G2 < G4; for the 1st-minute Apgar score, G1 > G3. CONCLUSIONS: The analgesia techniques, continuous epidural and double blockade, with small doses of local anesthetic, showed no interaction with the outcome of delivery when the analysis focuses on the newborn's weight and Apgar score.

Relevance:

80.00%

Publisher:

Abstract:

Reconfigurable Computing is an intermediate solution for the resolution of complex problems, making it possible to combine the speed of hardware with the flexibility of software. A reconfigurable architecture has several goals, among them the increase of performance. The use of reconfigurable architectures to increase system performance is a well-known technique, especially because of the possibility of implementing directly in hardware certain algorithms that are slow on current processors. Among the various segments that use reconfigurable architectures, reconfigurable processors deserve special mention. These processors combine the functions of a microprocessor with reconfigurable logic and can be adapted after the development process. Reconfigurable Instruction Set Processors (RISP) are a subgroup of reconfigurable processors whose goal is the reconfiguration of the processor's instruction set, involving issues such as the formats, operands and operations of the instructions. The main objective of this work is the development of a RISP processor, combining the techniques of configuring the processor's set of executed instructions at development time and reconfiguring it at execution time. The design and VHDL implementation of this RISP processor intend to prove the applicability and efficiency of two concepts: using more than one fixed instruction set, with only one set active at a given time, and the possibility of creating and combining new instructions so that the processor comes to recognize and use them at run time as if they existed in the fixed instruction set. The creation and combination of instructions is done through a reconfiguration unit incorporated into the processor. This unit allows the user to send custom instructions to the processor, so that they can later be used as if they were fixed instructions of the processor. This work also presents simulations of applications involving fixed and custom instructions, as well as the results of comparisons between these applications in terms of power consumption and execution time, which confirm the attainment of the goals for which the processor was developed.
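
The processor itself is implemented in VHDL, which is not reproduced here; as a software analogy only, the toy sketch below mimics the idea of a reconfiguration unit that registers custom instructions at run time so the core dispatches them exactly like fixed ones. All mnemonics and the MAC example are invented:

```python
# Hedged toy analogy of a RISP: a tiny interpreter whose instruction set
# can be extended at run time through a "reconfiguration" hook.
class TinyRISP:
    def __init__(self):
        self.regs = [0] * 8
        # Fixed instruction set, active from the start.
        self.isa = {
            "ADD": lambda d, a, b: self._set(d, self.regs[a] + self.regs[b]),
            "SUB": lambda d, a, b: self._set(d, self.regs[a] - self.regs[b]),
            "LDI": lambda d, imm: self._set(d, imm),
        }

    def _set(self, d, value):
        self.regs[d] = value

    def reconfigure(self, mnemonic, handler):
        """Reconfiguration unit: register a custom instruction at run time."""
        self.isa[mnemonic] = handler

    def run(self, program):
        for mnemonic, *operands in program:
            self.isa[mnemonic](*operands)   # dispatch is identical for fixed/custom

cpu = TinyRISP()
# A custom multiply-accumulate instruction created after "fabrication".
cpu.reconfigure("MAC", lambda d, a, b: cpu._set(d, cpu.regs[d] + cpu.regs[a] * cpu.regs[b]))
cpu.run([("LDI", 0, 3), ("LDI", 1, 4), ("LDI", 2, 5), ("MAC", 2, 0, 1)])
print(cpu.regs[2])   # 5 + 3*4 = 17
```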

Relevance:

80.00%

Publisher:

Abstract:

Machine Learning techniques are applied in classification tasks to acquire knowledge from a set of data or information. Some learning methods proposed in the literature are based on semi-supervised learning, which is characterized by a small percentage of labeled data (supervised learning) combined with a quantity of unlabeled examples (unsupervised learning) during the training phase, thus reducing the need for a large quantity of labeled instances when only a small set of labeled instances is available for training. A common problem in semi-supervised learning is the random selection of instances, since most papers use a random selection technique, which can have a negative impact. Many machine learning methods deal with single-label problems, in other words, problems where each data instance is associated with a single class; however, given the need in many domains to classify data into more than one class, this kind of classification is called multi-label classification. This work presents an experimental analysis of the results obtained using semi-supervised learning in multi-label classification problems, using a reliability parameter as an aid in the classification of the data. Thus, the use of semi-supervised learning techniques, together with multi-label classification methods, was essential to produce the results.
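
A minimal sketch of the confidence-based selection idea (using a reliability threshold instead of random selection), shown for a single-label self-training loop for brevity; the dissertation itself targets multi-label problems, and the dataset, classifier and threshold value below are assumptions:

```python
# Hedged sketch: self-training with a reliability (confidence) threshold.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(0)
labeled = rng.random(len(y)) < 0.1           # ~10% of instances keep their labels
X_lab, y_lab = X[labeled], y[labeled]
X_unl = X[~labeled]

reliability = 0.9                            # reliability parameter (assumed value)
clf = LogisticRegression(max_iter=1000)

for _ in range(5):                           # a few self-training rounds
    clf.fit(X_lab, y_lab)
    if len(X_unl) == 0:
        break
    proba = clf.predict_proba(X_unl)
    confident = proba.max(axis=1) >= reliability
    if not confident.any():
        break
    # Move only reliably classified instances into the labeled set.
    X_lab = np.vstack([X_lab, X_unl[confident]])
    y_lab = np.concatenate([y_lab, clf.classes_[proba[confident].argmax(axis=1)]])
    X_unl = X_unl[~confident]

print("final labeled set size:", len(y_lab))
```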

Relevance:

80.00%

Publisher:

Abstract:

The State has changed over time in order to meet a society with increasingly stringent demands. Techniques from the private sector began to be employed in an attempt to overcome the dysfunctions of an entrenched bureaucracy and make the public machine faster. Through federal law, competency-based people management was established as a reference for the human resources administration of the public sector, in an attempt to develop public servants professionally, based mainly on the three pillars of the model: knowledge, skills and attitudes. This thesis aims at understanding, from the employees' point of view, the perceived impacts of the organizational changes that occurred in the Department of Administration and Human Resources of the State of Rio Grande do Norte in order to implement competency-based people management. It is a single case study, characterized by research over a certain period of time, collecting data in the real environment of an organization, in this case SEARH/RN. The procedures used in data collection were literature review, documentary research and field research. We used a qualitative approach with an exploratory and descriptive character. The reform implemented in the institution was described and, from there, the impacts observed by the servants were analyzed. As a result, we observed a considerable advance in institutional activities, mainly relating to the physical/organizational structure and to human resources policies, with smaller advances in labor policies, largely as a result of the guiding focus of the reform at SEARH/RN. Overall, the impacts were more positive than negative and point to paths for improvement in public organizations. Making a general analysis of the modernization program implemented at SEARH/RN, we can conclude that there was a distinct change in all the dimensions studied, mostly with positive aspects, contrary to the opinion of some authors who claim that it is very difficult to implement reforms in public organizations, since they are highly institutionalized environments. What was found was a large organization, with gaps and weaknesses, but with a much larger number of successes and with recognition from institutional actors.

Relevance:

80.00%

Publisher:

Abstract:

The present study addressed the issues surrounding the Solidarity Economy and Tourism, their perspectives and contributions to the development process of local communities, as well as the connection points between the two. The study evaluated the extent to which the solidarity economy, through cooperatives and associations linked to tourism, has generated socio-economic improvements for the artisans of the Seridó tourist region. Regarding its objectives, the study was exploratory and descriptive, since it involved both standard data collection techniques, questionnaires and systematic observation, and secondary research and case studies, which characterizes exploratory research according to Castro (2008). The results indicate significant improvements brought about by the inclusion of members in groups (associations and cooperatives) in matters concerning health, education, interpersonal relationships and access to consumer credit for the artisans. Through this study, one may also note that the inclusion of products and services in tourism is not a relevant factor for the socio-economic improvements observed, given the strong presence of middlemen in the marketing process.

Relevance:

80.00%

Publisher:

Abstract:

Panoramic rendering is the visualization of three-dimensional objects in a virtual environment through a wide viewing angle. This work investigated whether the use of panoramas is able to promote faster searches in a virtual environment. Panoramas allow the presentation of the space with less need to change the orientation of the camera, especially in the case of projections spanning 360º around the user, which can benefit searching. However, the larger the angle, the more distorted the visualization of the environment becomes, causing confusion in navigation. The distortion is even bigger when the user changes the pitch of the camera by looking up or down. In this work we developed a technique to eliminate specifically the distortions caused by changes in pitch, which was called hemispherical projection. Experiments were carried out to evaluate the performance of search navigation using perspective, cylindrical and hemispherical projections. The results indicate that navigating with the perspective projection is superior to navigating with the panoramic projections, possibly due to factors such as (i) the participants' lack of experience in understanding scenes displayed as panoramas, (ii) the inherent presence of distortion in panoramic projections and (iii) a lower display resolution, because objects are presented in smaller sizes in panoramic projections, making the perception of details more difficult. However, the hemispherical projection was better than the cylindrical one, indicating that the developed technique provides benefits for navigation compared to current panoramic projection techniques. The hemispherical projection also required the smallest amount of camera orientation changes, which is an indication that hemispherical projections may be particularly useful in situations where there are restrictions on the ease of changing the orientation. Future research will investigate the performance of camera interactions on slower devices, such as keyboard-only input, or brain-machine interfaces.
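
The dissertation's own hemispherical projection is not specified here; purely as an illustration of the kind of direction-to-image mapping involved, the sketch below uses a textbook cylindrical (equirectangular-style) mapping and a generic equidistant-fisheye mapping, which may differ from the technique actually developed:

```python
# Hedged sketch: generic direction-to-panorama mappings, not the authors' formulas.
import math

def cylindrical(direction):
    """Map a unit view direction (x, y, z) to normalized panorama coordinates.
    Horizontal axis spans the full 360-degree azimuth; vertical axis is elevation."""
    x, y, z = direction
    azimuth = math.atan2(x, -z)               # angle around the vertical axis
    elevation = math.asin(max(-1.0, min(1.0, y)))
    u = (azimuth + math.pi) / (2 * math.pi)    # 0..1 across the full turn
    v = (elevation + math.pi / 2) / math.pi    # 0..1 from bottom to top
    return u, v

def hemispherical(direction, fov=math.pi):
    """Equidistant fisheye mapping of directions within `fov` of the view axis -z."""
    x, y, z = direction
    theta = math.acos(max(-1.0, min(1.0, -z)))   # angle from the view axis
    phi = math.atan2(y, x)                       # angle around the view axis
    r = theta / (fov / 2)                        # radius grows linearly with angle
    return 0.5 + 0.5 * r * math.cos(phi), 0.5 + 0.5 * r * math.sin(phi)

d = (0.0, 0.0, -1.0)   # looking straight ahead
print(cylindrical(d), hemispherical(d))
```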

Relevance:

80.00%

Publisher:

Abstract:

Data classification is a task with high applicability in many areas. Most methods found in the literature for treating classification problems deal with single-label, or traditional, problems. In recent years, a series of classification tasks has been identified in which the samples can be labeled with more than one class simultaneously (multi-label classification). Additionally, these classes can be hierarchically organized (hierarchical classification and hierarchical multi-label classification). On the other hand, a new category of learning, called semi-supervised learning, has also been studied; it combines labeled data (supervised learning) and unlabeled data (unsupervised learning) during the training phase, thus reducing the need for a large amount of labeled data when only a small set of labeled samples is available. Thus, since both multi-label and hierarchical multi-label classification techniques and semi-supervised learning have shown favorable results, this work proposes and applies semi-supervised learning to hierarchical multi-label classification tasks, so as to efficiently take advantage of the main strengths of the two areas. An experimental analysis of the proposed methods found that the use of semi-supervised learning in hierarchical multi-label methods presented satisfactory results, since the two approaches produced statistically similar results.
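
A small sketch of one ingredient of hierarchical multi-label classification, the constraint that a predicted class implies all of its ancestors; the class hierarchy and the predicted label set below are invented, and the proposed semi-supervised methods themselves are not reproduced:

```python
# Hedged sketch: enforcing hierarchical consistency on a multi-label prediction.
hierarchy = {                 # child -> parent (None marks the root)
    "root": None,
    "sports": "root",
    "football": "sports",
    "politics": "root",
    "elections": "politics",
}

def with_ancestors(labels):
    """Expand a predicted label set so every label's ancestors are included."""
    closed = set()
    for label in labels:
        while label is not None:
            closed.add(label)
            label = hierarchy[label]
    return closed

# A multi-label prediction for one sample (e.g., from a semi-supervised model).
predicted = {"football", "elections"}
print(sorted(with_ancestors(predicted)))
# ['elections', 'football', 'politics', 'root', 'sports']
```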