1000 results for Computer programmers - Occupational profile analysis


Relevance:

30.00%

Publisher:

Abstract:

This study investigated the experience of 3rd-year primary school students using tablets with educational content in the classroom, and identified positive and negative factors related to the interaction with the tablets. We also sought to understand whether the tablet can be used as the main tool supporting teaching. The study took place during the implementation of a pilot project, for which a technological system adapted to teaching was installed. A sample of 19 students aged 8 to 9 years was used. The study methodologies, namely observations, interviews and contextual interviews, allowed the collection of qualitative and quantitative data. It was demonstrated that users with a low level of digital literacy can have satisfactory experiences, and that their capacity for adaptation and learning meets the demands. From the analysis of the data obtained, we concluded that tablets can replace paper books and notebooks, provided they are integrated into an adapted technological system and have technical specifications, peripherals and applications similar to the material available in the pilot project studied. Even so, we identified some important limitations and suggest some solutions. The conclusions should be considered in the evaluation, design and implementation of new touch-based user interfaces intended for use in contexts similar to the one described in this study. The specific characteristics of the context in which this study took place leave some open questions that should be addressed in future work.

Relevance:

30.00%

Publisher:

Abstract:

This research studies the application of syntagmatic analysis of texts written in Brazilian Portuguese as a methodology for the automatic creation of extractive summaries. Automatic summarization, a topic within natural language processing (NLP), studies ways in which a computer can autonomously construct summaries of texts. Our working assumption is that teaching the computer how the language is structured, in our case Brazilian Portuguese, helps it find the most relevant sentences and consequently build extractive summaries with higher informativeness. In this study we propose a summarization method that automatically performs the syntagmatic analysis of texts and uses it to build an automatic summary. The phrases that make up the syntactic structures are used to analyze the sentences of the text, and the count of these elements determines whether or not a sentence will be included in the generated summary.
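As an illustration of the sentence-scoring idea described above, the following minimal Python sketch ranks sentences by a pre-computed count of syntactic phrases and keeps the top-scoring ones in document order. The sentences and counts are invented for illustration; a real implementation would need a Portuguese syntactic parser to produce the phrase counts.

```python
# Minimal sketch of phrase-count extractive summarization (illustrative only;
# the paper's syntagmatic analysis of Brazilian Portuguese is far richer).
# Each sentence comes paired with a pre-computed count of syntactic phrases.

def extractive_summary(sentences, phrase_counts, k=2):
    """Keep the k sentences with the most syntactic phrases, in original order."""
    ranked = sorted(range(len(sentences)),
                    key=lambda i: phrase_counts[i], reverse=True)
    chosen = sorted(ranked[:k])  # restore document order
    return [sentences[i] for i in chosen]

sents = ["A short aside.",
         "The main finding, with many phrases, is stated here.",
         "Methods and data are described in detail in this sentence."]
counts = [1, 5, 4]  # hypothetical phrase counts per sentence
print(extractive_summary(sents, counts, k=2))
```

The count-based score stands in for the paper's richer criterion, but the selection loop is the same: score, rank, extract.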

Relevance:

30.00%

Publisher:

Abstract:

Cephalometric analysis is the measurement of linear and angular quantities between landmark points, distances and lines on a teleradiograph, and is considered of fundamental importance for diagnosis and orthodontic planning. The objective of this research was therefore to compare cephalometric measurements obtained by dentists and radiologists from the analysis of the same radiograph in a computerized cephalometric analysis program. All participants marked 18 cephalometric points on a 14-inch notebook computer, as directed by the program itself (Radiocef 2®). From these, the program generated 14 cephalometric parameters covering skeletal, dental-skeletal, dental and soft-tissue measurements. To verify intra-examiner agreement, 10 professionals from each group repeated the marking of the points with a minimum interval of eight days between the two markings. Intra-group variability was calculated based on the coefficients of variation (CV). The comparison between groups was performed using Student's t-test for normally distributed variables and the Mann-Whitney test for those with a non-normal distribution. In the group of orthodontists, the measurements Pog and 1-NB, SL, S-Ls Line, S-Li Line and 1.NB showed high internal variability. In the group of radiologists, the same occurred with the values of Pog and 1-NB, S-Ls Line, S-Li Line and 1.NA. In the comparison between groups, all the analyzed linear values and two angular values showed statistically significant differences between radiologists and dentists (p < 0.05). According to the results, inter-examiner error in cephalometric analysis requires more attention, but it does not come from a specific class of specialists, whether dentists or radiologists.

Relevance:

30.00%

Publisher:

Abstract:

This work analyzes the behavior of some algorithms commonly found in the stereo correspondence literature on full HD images (1920x1080 pixels), in order to establish, within the precision-versus-runtime trade-off, the applications for which each method is better suited. The images are obtained by a system composed of a stereo camera coupled to a computer via a capture board. The OpenCV library is used for the computer vision and image processing operations involved. The algorithms discussed are a block-matching search using the Sum of Absolute Differences (SAD), a global technique based on energy-minimizing graph cuts, and a so-called semi-global matching technique. The criteria for analysis are processing time, heap memory consumption, and the mean absolute error of the generated disparity maps.
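The SAD block-matching idea can be sketched in a few lines of Python with NumPy. The paper itself relies on OpenCV's optimized implementations; this toy version exists only to show the cost function being minimized per pixel:

```python
import numpy as np

# Hedged sketch of SAD block matching for stereo disparity: for each left-image
# block, slide along the same row of the right image and keep the horizontal
# offset with the smallest sum of absolute differences.
def sad_disparity(left, right, block=3, max_disp=4):
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1]
            best, best_d = None, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y-half:y+half+1, x-d-half:x-d+half+1]
                cost = np.abs(patch.astype(int) - cand.astype(int)).sum()
                if best is None or cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp

left = np.tile(np.arange(12, dtype=np.int32), (12, 1))   # horizontal ramp
right = left + 2       # on a ramp, +2 equals shifting the scene by 2 pixels
print(sad_disparity(left, right)[5, 5])  # → 2
```

The O(h·w·d·block²) nested loops make clear why the full-HD benchmarks in the study hinge so strongly on the implementation quality of each method.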

Relevance:

30.00%

Publisher:

Abstract:

The use of software based on numerical approximation for metal forming is driven by the need to ensure process efficiency, in order to obtain high-quality products at the lowest cost and in the shortest time. This study uses the theory of similitude to develop a technique capable of simulating the stamping process of a metal sheet, obtaining results close to the real values with shorter processing times. The results are obtained through simulations performed in the finite element software STAMPACK®. This software uses the explicit time-integration method, which is usually applied to solve nonlinear problems involving contact, such as metal forming processes. The technique was developed from a stamping model of a square box, simulated with four different scale factors, two larger and two smaller than the real scale. The technique was validated with a bending model of a welded plate, which had a long simulation time. The application of the technique decreased simulation time by more than 50%. The results of applying the scale technique to plate forming were satisfactory, showing good quantitative results regarding the decrease in total simulation time. Finally, it is noted that the decrease in simulation time is only possible with the use of two related scales, the geometric and the kinematic. Kinematic scale factors should be used with caution, because high speeds can cause dynamic problems and could influence the results of the simulations.

Relevance:

30.00%

Publisher:

Abstract:

Creep in reinforced concrete is a phenomenon of great importance. Despite being identified as the main cause of several pathologies, its effects are still accounted for in a simplified way by structural designers. In addition to studying the phenomenon in reinforced concrete structures and how it is currently accounted for in structural analysis, this paper compares creep strains in simply supported reinforced concrete beams obtained analytically and experimentally with finite element method (FEM) simulation results. The strains and deflections obtained analytically were calculated following the recommendations of the Brazilian code NBR 6118 (2014) and the simplified method of CEB-FIP 90, and the experimental results were extracted from tests available in the literature. Finite element simulations were performed in ANSYS Workbench, using its 3D SOLID186 elements and the symmetry of the structure. Convergence analyses using 2D PLANE183 elements were carried out as well. It is concluded that FEM analyses are quantitatively and qualitatively efficient for estimating this nonlinearity, and that the method used to obtain the creep coefficient values is sufficiently accurate.
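For the NBR 6118 side of the comparison, the code's simplified creep treatment amplifies the immediate deflection by an additional-deflection factor α_f = Δξ/(1 + 50ρ'), where ξ(t) is a time coefficient and ρ' the compression reinforcement ratio. A hedged Python sketch of that recipe (the numeric inputs are illustrative, not taken from the paper):

```python
# Hedged sketch of the simplified creep-deflection amplification in the
# Brazilian code NBR 6118; the input values below are illustrative,
# not a design calculation.

def xi(t_months):
    """Time coefficient xi(t); t in months, taken as 2.0 beyond 70 months."""
    if t_months > 70:
        return 2.0
    return 0.68 * (0.996 ** t_months) * t_months ** 0.32

def creep_factor(t, t0, rho_comp):
    """Additional-deflection factor alpha_f = delta_xi / (1 + 50*rho')."""
    return (xi(t) - xi(t0)) / (1 + 50 * rho_comp)

delta_immediate = 8.0  # mm, hypothetical immediate deflection
alpha_f = creep_factor(t=100, t0=1, rho_comp=0.0)  # no compression steel
print(delta_immediate * (1 + alpha_f))  # total long-term deflection, mm
```

This one-factor amplification is exactly the kind of simplification the paper benchmarks against the FEM and experimental creep curves.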

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a study of brain signals applied to BCI (Brain-Computer Interface) systems, through the use of decision trees and the analysis of those trees on the basis of neuroscience. Processing the data requires five phases: data acquisition, preprocessing, feature extraction, classification and validation. All phases are covered in this work, but the emphasis is on the classification and validation phases. For classification, we use the artificial intelligence technique known as decision trees, recognized in the literature as one of the simplest and most successful families of learning algorithms. The validation phase, in turn, is grounded in neuroscience, the set of disciplines that study the nervous system: its structure, development, functioning, evolution, relation to behavior and the mind, and also its disorders. The results obtained in this work are promising, even though they are preliminary, since they can help explain, in an automatic way, some brain processes.
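As a rough illustration of the classification phase, the sketch below learns a one-level decision tree (a stump) on two invented band-power features; a real BCI pipeline would train on actual EEG feature vectors and grow deeper trees.

```python
# Minimal sketch of learning a one-level decision tree (a "stump") on
# synthetic two-feature samples, standing in for the EEG band-power
# features a real BCI pipeline would extract. Purely illustrative.

def fit_stump(X, y):
    """Pick the (feature, threshold, labels) split minimizing training errors."""
    best = None
    for f in range(len(X[0])):
        for thr in sorted({x[f] for x in X}):
            for left_lab, right_lab in ((0, 1), (1, 0)):
                pred = [left_lab if x[f] <= thr else right_lab for x in X]
                errs = sum(p != t for p, t in zip(pred, y))
                if best is None or errs < best[0]:
                    best = (errs, f, thr, left_lab, right_lab)
    return best[1:]

def predict(stump, x):
    f, thr, left_lab, right_lab = stump
    return left_lab if x[f] <= thr else right_lab

X = [[0.2, 0.9], [0.3, 0.8], [0.7, 0.2], [0.9, 0.1]]  # [alpha, beta] power
y = [0, 0, 1, 1]                                      # 0 = rest, 1 = movement
stump = fit_stump(X, y)
print([predict(stump, x) for x in X])  # reproduces y on this toy set
```

A full decision-tree learner applies this same best-split search recursively to each branch, which is what makes the resulting trees readable enough to be checked against neuroscientific knowledge, as the work proposes.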

Relevance:

30.00%

Publisher:

Abstract:

The choice of this dissertation's subject arose from the need of the members of project FCOMP-01-0124-FEDER-007360 - Inquirir da Honra: Comissários do Santo Ofício e das Ordens Militares em Portugal (1570-1773) (To Inquire Honor: Inquisition Commissioners and the Military Orders in Portugal) to meet certain goals related to the genealogy of the network of Commissioners. The manual work system used until now contained a considerable amount of complex information, describing in detail the characteristics not only of the individuals but also of everything associated with them, including who they were related to and how. The main goal was thus to answer the question: "How can all the genealogical information collected on paper be managed and analyzed on a computer, using technologies that are both efficient and easy for users to learn?". To answer it, it was first necessary to get to know the universe of genealogy and how it operates, so that an entire application could then be designed and shaped to the users' needs. Nevertheless, the application does not focus only on management, using a MySQL database management system, and on "traditional" genealogical analysis in programs such as Personal Ancestral File. Above all, the user should use it to answer the questions of "the present", in the hope that the application itself serves as motivation for new questions, through the integration of XML technology and the Geographic Information System Google Earth, thus allowing genealogical information to be analyzed on the world map.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. There are few reports studying the correlation between MELD and long-term posttransplantation survival. AIM: To assess the value of pretransplant MELD in the prediction of posttransplant survival. METHODS: Adult patients (age >18 years) who underwent liver transplantation were examined in a retrospective longitudinal cohort, using the prospective database. We excluded acute liver failure, retransplantation, and reduced or split livers. The liver donors were evaluated according to: age, sex, weight, creatinine, bilirubin, sodium, aspartate aminotransferase, personal antecedents, cause of brain death, steatosis, expanded-criteria donor number and index donor risk. The recipients' data were: sex, age, weight, chronic hepatic disease, Child-Turcotte-Pugh points, pretransplant and initial MELD score, pretransplant creatinine clearance, sodium, cold and warm ischemia times, hospital length of stay, blood requirements, and alanine aminotransferase (ALT >1,000 UI/L = liver dysfunction). The Kaplan-Meier method with the log-rank test was used for the univariable analyses of posttransplant patient survival. For the multivariable analyses, the Cox proportional hazards regression method with the stepwise procedure was used, stratifying by sodium and MELD. ROC curves were used to define the areas under the curve for MELD and Child-Turcotte-Pugh. RESULTS: A total of 232 patients with 10 years of follow-up were available. The MELD cutoff was 20 and the Child-Turcotte-Pugh cutoff was 11.5. For MELD score > 20, the risk factors for death were: red cell requirements, liver dysfunction and donor's sodium. For the patients with hyponatremia, the risk factors were: negative delta-MELD score, red cell requirements, liver dysfunction and donor's sodium.
The univariate regression analyses yielded the following risk factors for death: MELD score > 25, blood requirements, recipient pretransplant creatinine clearance and donor age > 50. After stepwise analysis, only red cell requirement was predictive. Patients with MELD score < 25 had a 68.86%, 50.44% and 41.50% chance of 1-, 5- and 10-year survival, and those with MELD > 25 had 39.13%, 29.81% and 22.36%, respectively. Patients without hyponatremia had 65.16%, 50.28% and 41.98%, and those with hyponatremia 44.44%, 34.28% and 28.57%, respectively. Patients with IDR > 1.7 showed 53.7%, 27.71% and 13.85%, and index donor risk < 1.7 showed 63.62%, 51.4% and 44.08%, respectively. Donor age > 50 years showed 38.4%, 26.21% and 13.1%, and donor age < 50 years showed 65.58%, 26.21% and 13.1%. Association with the delta-MELD score did not show any significant difference. Expanded-criteria donors were associated with primary non-function and severe liver dysfunction. Predictive factors for death were blood requirements, hyponatremia, liver dysfunction and donor's sodium. CONCLUSION: MELD over 25, recipient's hyponatremia, blood requirements and donor's sodium were associated with poor survival.
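For reference, the MELD score discussed above is computed from three laboratory values. The sketch below uses the published coefficients of the original formula (Kamath et al.), with the common clamping conventions noted in the comments; it is a reader's aid, not part of the study's methods:

```python
import math

# Sketch of the original MELD formula; creatinine and bilirubin in mg/dL.
# Common clamping conventions: values below 1.0 are set to 1.0 and
# creatinine is capped at 4.0 mg/dL.
def meld(creatinine, bilirubin, inr):
    cr = min(max(creatinine, 1.0), 4.0)
    bi = max(bilirubin, 1.0)
    rr = max(inr, 1.0)
    score = (9.57 * math.log(cr) + 3.78 * math.log(bi)
             + 11.2 * math.log(rr) + 6.43)
    return round(score)

print(meld(1.0, 1.0, 1.0))  # all labs at baseline → 6
```

The logarithms mean the score rises steeply as any one lab value departs from baseline, which is why cutoffs such as 20 or 25 separate survival curves so sharply.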

Relevance:

20.00%

Publisher:

Abstract:

Errors are always present in experimental measurements, so it is important to identify them and understand how they affect the results of experiments. Statistics suggests that experiments should be executed in random order, but unfortunately the complete randomization of experiments is not always viable for practical reasons. One possible simplification is blocked experiments, in which the levels of certain factors are kept fixed while the levels of others are randomized. However, this has a cost: although the experimental part is simplified, the statistical analysis becomes more complex.
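The restricted randomization described above can be sketched as follows; the factor names are invented for illustration:

```python
import random

# Sketch of run-order randomization within blocks: the hard-to-change factor
# (e.g. a furnace temperature) stays fixed within each block, while the
# easy-to-change factor's levels are shuffled inside the block.
def blocked_run_order(block_levels, within_levels, seed=0):
    rng = random.Random(seed)  # fixed seed for a reproducible plan
    order = []
    for b in block_levels:
        runs = list(within_levels)
        rng.shuffle(runs)            # randomize only within the block
        order.extend((b, w) for w in runs)
    return order

print(blocked_run_order(["T1", "T2"], ["A", "B", "C"]))
```

The statistical cost mentioned above follows directly from this structure: runs within a block share the block's conditions, so their errors are correlated and the analysis must include a block effect.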

Relevance:

20.00%

Publisher:

Abstract:

Chromatography combined with different detection systems is among the most used and best performing analytical tools. Chromatography with tandem mass spectrometric detection provides highly selective and sensitive analyses and yields structural information about the analytes and their molar masses. Because of these characteristics, this analytical technique is very efficient for detecting substances at trace levels in complex matrices. In this paper we review instrumental and technical aspects of chromatography-tandem mass spectrometry and the state of the art of the technique as applied to the analysis of toxic substances in food.

Relevance:

20.00%

Publisher:

Abstract:

A method to quantify lycopene and β-carotene in freeze-dried tomato pulp by high performance liquid chromatography (HPLC) was validated according to the criteria of selectivity, sensitivity, precision and accuracy, and the measurement uncertainty was estimated with data obtained in the validation. The validated method is selective for the intended analysis and showed good precision and accuracy. The detection limits for lycopene and β-carotene were 4.2 and 0.23 mg 100 g-1, respectively. The expanded uncertainty (k = 2) estimate for lycopene was 104 ± 21 mg 100 g-1 and for β-carotene 6.4 ± 1.5 mg 100 g-1.
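The expanded-uncertainty figure follows the usual GUM recipe: combine the standard uncertainty components in quadrature and multiply by the coverage factor k. A minimal sketch with hypothetical component values (not the paper's uncertainty budget):

```python
import math

# Sketch of expanded uncertainty: combine independent standard uncertainties
# in quadrature and multiply by a coverage factor k = 2 (~95% confidence).
# The component values below are hypothetical, not the paper's budget.
def expanded_uncertainty(components, k=2):
    u_c = math.sqrt(sum(u * u for u in components))  # combined standard unc.
    return k * u_c

u = expanded_uncertainty([8.0, 6.0])  # e.g. precision and calibration terms
print(u)  # 2 * sqrt(64 + 36) = 20.0
```

Reporting a result as value ± U with k = 2, as the abstract does (104 ± 21 mg 100 g-1), corresponds to an approximately 95% coverage interval.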

Relevance:

20.00%

Publisher:

Abstract:

This manuscript presents the basic concepts and a practical application of Principal Component Analysis (PCA) as a tutorial for beginners and undergraduate and graduate students, using the Matlab or Octave computing environment. As a practical example, the exploratory analysis of edible vegetable oils by mid-infrared spectroscopy is shown.
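The core PCA computation such a tutorial teaches can be written compactly; the sketch below does it in Python rather than Matlab/Octave, on an invented nearly-collinear data set:

```python
import numpy as np

# Minimal PCA sketch: mean-center the data matrix, take the SVD, and
# project onto the leading principal components. Same computation the
# tutorial performs in Matlab/Octave.
def pca(X, n_components=2):
    Xc = X - X.mean(axis=0)                  # center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T        # sample coordinates (scores)
    explained = S**2 / np.sum(S**2)          # variance ratio per PC
    return scores, Vt[:n_components], explained[:n_components]

X = np.array([[2.0, 4.1], [1.0, 2.0], [3.0, 6.1], [4.0, 8.0]])
scores, loadings, ratio = pca(X, 1)
print(ratio)  # nearly all variance on PC1 for this almost-collinear data
```

For spectroscopic data the same call applies with one row per sample spectrum; the scores plot is then the exploratory map used to separate the oil classes.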

Relevance:

20.00%

Publisher:

Abstract:

The difficulty of adapting European dairy cattle breeds to Brazil considerably affects the milk production sector. Brazilian climatic conditions are not entirely favorable, and new technologies are needed for the animals to express their genetic potential as well as their best feed conversion. An economic analysis of the investment in free-stall climatization equipment for dairy housing is necessary to estimate the profit, the possibility of return on the investment, and the time to that return. The objective of this research was to evaluate the influence of the climatization investment on the milk production process and to analyze the economic aspect of this investment. A total of 470 highly productive dairy cows with homogeneous genetic and morphological characteristics were used and analyzed over two similar periods. Investment calculations were done using Excel®. The results were satisfactory, and the invested capital was shown to return to the producer in the short term of 57 days.
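The payback logic behind such a spreadsheet analysis can be sketched in a few lines; the investment and daily-gain figures below are hypothetical, chosen only to illustrate the computation:

```python
# Sketch of a simple payback calculation like the one the study ran in a
# spreadsheet: days until the cumulative extra milk revenue covers the
# climatization-equipment investment. All numbers are hypothetical.
def payback_days(investment, daily_gain):
    days = 0
    accumulated = 0.0
    while accumulated < investment:
        accumulated += daily_gain
        days += 1
    return days

print(payback_days(investment=28_500.0, daily_gain=500.0))  # → 57 days
```

A fuller analysis would also discount the cash flows and subtract the equipment's operating cost from the daily gain, but the day-counting core is the same.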

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this work was to analyze the logistical distribution of Brazilian soybean by applying quadratic programming to a spatial equilibrium model. The soybean transportation system is an important part of the soybean complex in Brazil, since the major part of the cost of this commodity derives from transportation costs. Therefore, the optimization of this part of the process is essential for the competitiveness of Brazilian soybean in the international market. The Brazilian soybean complex has been increasing its share of total agricultural export value over the last ten years, but owing to other countries' investments, Brazilian exports cannot rely only on increasing production; they must also become more efficient. A model was thus obtained that can project new scenarios by varying transportation costs and guide policy makers toward new investments in the sector.
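The quadratic-programming core of a spatial equilibrium model can be illustrated on a two-route toy problem: with volume-dependent (quadratic) route costs, the optimal split of a fixed flow equalizes marginal costs across routes. All coefficients below are invented, not estimated from Brazilian data:

```python
# Toy sketch of the quadratic-programming idea behind spatial equilibrium
# models: ship a fixed demand over two routes whose costs rise with volume
# (cost_i(x) = c_i*x + q_i*x^2), choosing the split that equalizes
# marginal costs. Coefficients are hypothetical.
def optimal_split(c1, q1, c2, q2, demand):
    # minimize c1*x1 + q1*x1^2 + c2*x2 + q2*x2^2  s.t.  x1 + x2 = demand
    # first-order condition: c1 + 2*q1*x1 = c2 + 2*q2*(demand - x1)
    x1 = (c2 - c1 + 2 * q2 * demand) / (2 * (q1 + q2))
    x1 = min(max(x1, 0.0), demand)   # keep the split feasible
    return x1, demand - x1

x_road, x_rail = optimal_split(c1=10.0, q1=0.5, c2=6.0, q2=0.5, demand=8.0)
print(x_road, x_rail)  # the cheaper route carries more, but not everything
```

A full Takayama-Judge model solves the same kind of quadratic program over many supply regions, demand regions and transport modes, which is what lets it project new scenarios when transportation costs change.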