985 results for Data handling
Abstract:
In this study, focused on learning to handle money, the aim was for students to acquire competences enabling a greater degree of independence and participation in society, performing financial tasks more independently, such as buying products, paying for services, and managing money. To achieve this, the direct instruction methodology was used, with structured tasks. In an initial phase the researcher provided constant support to the students, which was gradually reduced as they attained the money-related competences. In the final phase, the students performed the proposed tasks autonomously. Designed as a case study, data were collected through direct observation and monitoring tests. The students first completed an initial assessment to establish the baseline for the intervention. The intervention, based on direct instruction, was then carried out using a computer, a calculator, monitoring tests, and money handling. The computer was used in the intervention as an assistive learning technology, enabling interactive games and the consultation of materials. By the end of the intervention the students showed autonomy in solving the tasks, having already automated the mathematical processes needed to handle the euro currency correctly. Direct instruction helped the students retain the essential mathematical competences for handling money (composing amounts, making payments, and checking change), which can greatly contribute to their independent participation in society.
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
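The entropy-based comparison described above can be sketched as follows. This is a minimal illustration of the classical Jensen–Shannon divergence between two empirical distributions; the per-bin counts for two grid regions are hypothetical, not taken from the paper's dataset:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a normalized distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def jensen_shannon(p, q):
    """Classical Jensen-Shannon divergence between distributions p and q."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2

def normalize(counts):
    total = sum(counts)
    return [c / total for c in counts]

# Hypothetical earthquake counts per magnitude bin for two grid regions
region_a = normalize([120, 60, 25, 8, 2])
region_b = normalize([100, 70, 30, 10, 5])

d = jensen_shannon(region_a, region_b)
# JSD is symmetric, non-negative, and bounded above by ln(2)
```

Because the divergence collapses each pairwise comparison into a single number, a matrix of such values can feed directly into the hierarchical clustering and multidimensional scaling steps the abstract mentions.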
Abstract:
Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes the industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power-law (PL) distributions. For early years, the data reveal double PL behavior, while, for more recent time periods, a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data and multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
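As a rough illustration of the single-power-law part of such an analysis, the sketch below draws samples from a continuous power law by inverse-transform sampling and recovers the exponent with the standard continuous maximum-likelihood estimator; the exponent, cutoff, and sample size are arbitrary, not the paper's accident data:

```python
import math
import random

def pareto_samples(alpha, xmin, n, seed=42):
    """Inverse-transform sampling from a continuous power law p(x) ~ x^(-alpha), x >= xmin."""
    rng = random.Random(seed)
    return [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def ml_exponent(data, xmin):
    """Continuous maximum-likelihood estimate of the power-law exponent alpha."""
    n = len(data)
    return 1 + n / sum(math.log(x / xmin) for x in data)

def kl_divergence(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0 and b > 0)

xs = pareto_samples(alpha=2.5, xmin=1.0, n=20000)
alpha_hat = ml_exponent(xs, xmin=1.0)  # recovers a value close to 2.5
```

A double-PL fit, as reported for the early years, would apply the same estimator separately above and below a crossover point.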
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware/products under development. Although this is an attractive solution (low cost, easy, and a fast way to perform some coursework), it has major disadvantages. As everything is currently being done with, and in, a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithmics, mathematics, and physics. All of these areas can use numerical analysis software, simulation software, or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, but real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that allows it to be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feel by using real data. The proposed framework is based on a set of low-cost sensors for different physical magnitudes (e.g., temperature, light, wind speed) which are either connected to a central server that the students access over Ethernet, or connected directly to the student's computer/laptop. These sensors use the available communication ports: serial ports, parallel ports, Ethernet, or Universal Serial Bus (USB).
Since a central server is used, the students are encouraged to use the sensor readings in their different courses and, consequently, in different types of software: numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools. This allows sensors that are not available at a given school to be used by fetching the values from other places that share them. Another remark is that students in the more advanced years, with (theoretically) more know-how, can use the courses related to electronics development to build new sensor modules and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible, since the same materials can be reused across several courses, bringing real-world data into the students' computer work.
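A minimal sketch of the server side of such a framework, assuming a plain TCP protocol in which the server answers each connection with the latest reading of a (here simulated) temperature sensor. The message format, the sensor stub, and the use of a loopback address are all illustrative assumptions, not the paper's actual protocol:

```python
import socket
import threading

def read_temperature():
    """Stub standing in for a real sensor driver (serial/parallel/USB)."""
    return 21.5  # a real implementation would query the hardware here

def serve_once(server_sock):
    """Accept one client and reply with 'temperature,<value>' plus newline."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(f"temperature,{read_temperature()}\n".encode())

# Server bound to an ephemeral loopback port; a deployment would use a fixed, known port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# Client side: any course tool that speaks TCP can fetch the reading.
with socket.create_connection(("127.0.0.1", port)) as client:
    line = client.makefile().readline().strip()
name, value = line.split(",")
```

The same one-line-per-reading format can then be pasted into a spreadsheet or parsed by a numerical analysis tool, which is the cross-course reuse the framework aims at.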
Abstract:
Demo at the Workshop on ns-3 (WNS3 2015), 13–14 May 2015, Castelldefels, Spain.
Abstract:
Data Mining (DM) methods are being increasingly used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction. This is an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction. This is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
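A common baseline in this literature is to turn a series into a supervised problem using lagged values as features. The sketch below does this with a one-lag ordinary-least-squares fit on a synthetic series; the AR(1)-like "price" data and all parameters are illustrative assumptions, not results from any reviewed paper:

```python
import random

def make_lagged(series, lag=1):
    """Turn a series into (x, y) pairs: predict y[t] from y[t-lag]."""
    return [(series[t - lag], series[t]) for t in range(lag, len(series))]

def fit_line(pairs):
    """Ordinary least squares for y = a*x + b (one lag feature)."""
    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Synthetic AR(1)-like "price" series: y[t] = 0.8*y[t-1] + 2.0 + noise
rng = random.Random(0)
prices = [10.0]
for _ in range(500):
    prices.append(0.8 * prices[-1] + 2.0 + rng.gauss(0, 0.1))

a, b = fit_line(make_lagged(prices, lag=1))  # slope should approach 0.8
forecast = a * prices[-1] + b  # one-step-ahead prediction
```

DM methods surveyed in such reviews (neural networks, support vector machines, decision trees) replace the linear fit while keeping this same lagged-feature framing.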
Abstract:
Internship report submitted in fulfilment of the requirements for the degree of Master in New Media and Web Practices (Novos Media e Práticas Web).
Abstract:
Twenty-four whole-blood and serum samples were drawn from an eight-year-old heart-transplant child during a 36-month follow-up. EBV serology was positive for VCA-IgM and IgG, and negative for EBNA-IgG, at the age of five, when the child presented with signs and symptoms suggestive of acute infectious mononucleosis. After 14 months, the serological parameters were: positive VCA-IgG and EBNA-IgG, and negative VCA-IgM. This serological pattern has been maintained since then, even during episodes suggestive of EBV reactivation. PCR amplified a specific DNA fragment from the EBV gp220 (detection limit of 100 viral copies). All twenty-four whole-blood samples yielded positive results by PCR, while 12 out of 24 serum samples were positive. We aimed at analyzing whether detection of EBV-DNA in serum samples by PCR was associated with overt disease, as indicated by the need for antiviral treatment and hospitalization. Statistical analysis showed agreement between the two parameters, evidenced by the Kappa test (value 0.750; p < 0.001). We concluded that detection of EBV-DNA in serum samples of immunosuppressed patients might be used as a laboratory marker of active EBV disease when Real-Time PCR or another quantitative method is not available.
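The agreement analysis reported above can be sketched with Cohen's kappa on a 2×2 contingency table. The counts below are hypothetical, chosen only to show the computation; they do not reproduce the paper's kappa of 0.750:

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 table [[a, b], [c, d]] of two binary classifications."""
    (a, b), (c, d) = table
    n = a + b + c + d
    observed = (a + d) / n  # observed agreement (both positive or both negative)
    # Expected agreement if the two classifications were independent
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical counts: rows = serum PCR (pos/neg), cols = overt disease (yes/no)
table = [[10, 2], [2, 10]]
kappa = cohens_kappa(table)
```

Kappa is 1.0 for perfect agreement and 0 when the agreement is no better than chance, which is why it is preferred over raw percent agreement for this kind of marker-versus-outcome comparison.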
Abstract:
Forty Echinococcus isolates from sheep and cattle in Southern Brazil were genetically analysed in order to obtain further data on the presence of different taxa of the Echinococcus granulosus complex. Differentiation was done using a PCR technique and sequencing of mitochondrial cytochrome c oxidase subunit 1 (CO1). Most samples (38) could be allocated to the sheep strain (G1) of E. granulosus, while two samples belonged to E. ortleppi, previously known as the cattle strain (G5) of E. granulosus. Due to the shorter prepatent period of the latter taxon in dogs, these records have important implications for the design of control measures in this endemic region.
Abstract:
Information is of growing importance today. Since their foundation, organizations produce information daily that feeds their organizational information system. This system goes through a life cycle covering processes related to its planning and development, without which it would not be possible to make decisions and respond to the demands of the surrounding environment, given the enormous volume of data organizations must process. This internship addresses the importance of information in managing the assets of ATAHCA – Associação de Desenvolvimento das Terras Altas do Homem, Cávado e Ave, which include buildings, furniture, works of art, machines, utensils, tools, means of transport, and documents. The objective of this work is therefore to develop a decision-support system based on the inventory of all assets. The creation of a procedures manual is essential to guarantee the correct handling of the system and will contribute to effective information management. The information system to be developed will be a decision-support model that enables the management of the inventory/assets, as well as their quantification and valuation. It is also intended to discuss and analyse the contribution of information management to supporting assertive and profitable decision-making for the organization.
Abstract:
This paper presents the design of a low-cost, small autonomous surface vehicle for missions in coastal waters and specifically for the challenging surf zone. The main objective of the vehicle design described in this paper is to address both the capability of operating at sea in relatively challenging conditions and maintaining a very low set of operational requirements (ease of deployment). This vehicle provides a first step towards being able to perform general-purpose missions (such as data gathering or patrolling) and, at least over relatively short distances, to be used in rescue operations (with very low handling requirements), such as carrying support to humans in the water. The USV is based on a commercially available fiberglass hull and uses a directional waterjet powered by a brushless electric motor for propulsion, thus without any protruding propeller, reducing danger in rescue operations. Its small dimensions (1.5 m length) and weight allow versatility and ease of deployment. The vehicle design is described in this paper from both a hardware and a software point of view. A characterization of the vehicle in terms of energy consumption and performance is provided, from both test-tank and operational-scenario tests. An example application in search and rescue is also presented and discussed, with the integration of this vehicle in the European ICARUS (7th Framework) research project, which addresses the development and integration of robotic tools for large-scale search and rescue operations.
Abstract:
Project work submitted as a partial requirement for the degree of Master in Statistics and Information Management.
Abstract:
Project submitted as a partial requirement for the degree of Master in English Teaching.
Abstract:
The incidence of Candida bloodstream infection has increased over the past years. In the Center-West region of Brazil, data on candidemia are scarce. This paper reports a retrospective analysis of 96 cases of Candida bloodstream infection at a Brazilian tertiary-care teaching hospital in the state of Mato Grosso do Sul, from January 1998 to December 2006. Demographic, clinical, and laboratory data were collected from medical records and from the hospital's laboratory database. Patients' ages ranged from three days to 92 years, with 53 (55.2%) adults and 43 (44.8%) children. Of the latter, 25 (58.1%) were newborns. The risk conditions most often found were: a long period of hospitalization, use of a central venous catheter, and previous use of antibiotics. Fifty-eight (60.4%) patients died during the hospitalization period, and eight (13.7%) of them died 30 days after the diagnosis of candidemia. Candida albicans (45.8%) was the most prevalent species, followed by C. parapsilosis (34.4%), C. tropicalis (14.6%), and C. glabrata (5.2%). This is the first report of Candida bloodstream infection in the state of Mato Grosso do Sul, and it highlights the importance of considering the possibility of invasive Candida infection in patients exposed to risk factors, particularly among neonates and the elderly.
Abstract:
Dissertation submitted to obtain the degree of Master in Informatics Engineering (Engenharia Informática).