905 results for computer-aided qualitative data analysis software
Abstract:
This study performed a predictive analysis of the occurrence of adverse events among patients of a healthcare provider (IPS) in Bogotá, Mederi Hospital Universitario de Barrios Unidos (HUBU), during 2013, in relation to hospital efficiency indicators (hospital occupancy percentage, number of hospital discharges, average length of hospital stay, number of emergency-department discharges, average emergency-department stay). The data were exported to an analysis matrix: qualitative variables were presented as absolute and relative frequencies, and quantitative variables (age, length of stay) as means and standard deviations. Adverse-event and hospital-efficiency data were then grouped into a new matrix to enable predictive analysis, which was exported to the statistical modelling software EViews 6.5. Multivariate predictive models were specified for the number of adverse events as a function of the hospital efficiency indicators; probabilities of occurrence were estimated, along with correlation and multicollinearity analyses, and the results were presented in estimation tables for each model. The analysis was restricted to preventable and non-preventable adverse events, with information obtained from an information system that records the factors related to the occurrence of adverse health events (the health-event reporting system, reports in clinical records, individual reports, reports by service, data analysis and case studies); hospital efficiency data were extracted for the same period.
The analysis and management of adverse events aims to establish continuous-improvement strategies and to analyse results against the efficiency indicators, allowing intervention on the operational risk factors of the services of the Hospital Universitario de Barrios Unidos (HUBU) related to adverse events in patient care. In particular, according to the results obtained, the focus should be on the management of patient discharges, in order to align with and strengthen patient-safety policies and provide comprehensive care with quality and efficiency, reducing care complaints, billing disputes (glosas) and legal risks, in line with the predictive model studied.
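The modelling workflow this abstract describes, a multivariate model of adverse-event counts against efficiency indicators with correlation and multicollinearity checks, was carried out in EViews 6.5. As a minimal illustrative sketch, with invented monthly data and variable names, the same steps look like this in Python:

```python
# Sketch of the abstract's workflow: a multivariate linear model predicting the
# monthly number of adverse events from hospital-efficiency indicators, plus a
# correlation check for multicollinearity. Data and variable names are invented
# for illustration; the study itself used EViews 6.5.

def mean(xs):
    return sum(xs) / len(xs)

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def ols(X, y):
    """Least-squares fit: solve the normal equations (X'X) b = X'y."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for c in range(k):                      # forward elimination with pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    coef = [0.0] * k
    for c in reversed(range(k)):            # back substitution
        coef[c] = (b[c] - sum(A[c][j] * coef[j] for j in range(c + 1, k))) / A[c][c]
    return coef

# Hypothetical monthly data: occupancy (%), hospital discharges, adverse events.
occupancy  = [78, 82, 85, 80, 90, 88, 76, 84, 91, 87, 79, 86]
discharges = [410, 430, 455, 420, 480, 470, 400, 445, 490, 465, 415, 460]
events     = [12, 14, 16, 13, 19, 18, 11, 15, 20, 17, 12, 16]

r = pearson(occupancy, discharges)          # high |r| warns of multicollinearity
X = [[1.0, o, d] for o, d in zip(occupancy, discharges)]
b0, b1, b2 = ols(X, events)
predicted = b0 + b1 * 85 + b2 * 450         # point prediction for a new month
```

High pairwise correlation between predictors (here occupancy and discharges) is the multicollinearity the study tested for: individual coefficients become unstable even though predictions at in-range points remain usable.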
Abstract:
In this study we investigate how the school has been addressing environmental education as a cross-cutting, interdisciplinary theme. We gathered information on how environmental education has been developed in the pedagogical practice of a state public school in the city of Mossoró/RN, Brazil, and identified the perceptions of the actors involved in the environmental-education process: the level of ecological awareness expressed by the students and their practices regarding the environmental problems they experience; the teacher's approach to the theme; and the perceptions of institutional representatives, namely the teacher, the school board, the state secretary of education and the municipal environment manager of Mossoró, RN. The literature review draws on authors such as Saviani (2008), Dias (2004), Gadotti (2008), Paulo Freire (1991), Sato (2012), Loureiro (2004) and Leff (2010), among others. The investigation combined qualitative and quantitative approaches: four interviews were conducted with the institutional representatives, and a questionnaire was administered to the school's students. The responses were tabulated in Microsoft Excel spreadsheets to build a database for statistical analysis, which was then exported to SPSS version 13.0, where the analysis was performed. For the data analysis, observed frequencies and percentages were calculated for the students' perceptions of the judgments, procedures used by the school, associated items, and environmental problems and themes; in addition, charts were constructed for each distribution. The qualitative content analysis, in turn, used qualitative data-matching interpretation as its analysis strategy.
We found that our subjects believe environmental education serves as an instrument for changing human behaviour: it is through education that we change attitudes and raise our population's awareness of the care our planet requires. In this investigation, 83.1% of the students said they were quite aware of environmental problems, and 71.8% said they were quite motivated to develop environmental-education projects at their school. However, this was not corroborated by the institutional representatives, who stated that the students do not have the level of ecological awareness they claim. The representatives thus take a critical view of the theme, unlike the students, who say they are aware but whose practices do not match that claim. We believe that if environmental education were introduced as a compulsory curricular component it could be addressed more directly and forcefully, so as to form citizens genuinely aware of environmental issues. Environmental education must go beyond the school walls: the environmental question is also a social question, and interventions are needed at the global level so that everyone can contribute meaningfully to sustainability.
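The quantitative step described above, observed frequencies and percentages of questionnaire responses (done there in Excel and SPSS 13.0), amounts to a frequency table. A minimal sketch with invented response labels:

```python
# Minimal sketch of the frequency/percentage tabulation the abstract describes
# (done there in Excel and SPSS 13.0). Responses are invented for illustration.
from collections import Counter

responses = ["quite aware", "quite aware", "somewhat aware", "not aware",
             "quite aware", "somewhat aware", "quite aware", "quite aware"]

counts = Counter(responses)
total = len(responses)
# level -> (observed frequency, percentage)
table = {level: (n, round(100 * n / total, 1)) for level, n in counts.items()}
```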
Abstract:
The central concern of this research was to understand how environmental education is taught, from the perspective of sustainability, in everyday classroom practice in the Projovem Urbano programme in the metropolitan region of Recife (RMR), Pernambuco, Brazil. The research was carried out with 110 students of both sexes from public schools in the RMR enrolled in the Projovem course, and with 10 teachers who teach in the project. We administered an adapted questionnaire to the students and conducted semi-structured interviews with the teachers. For the quantitative data analysis we used the Statistical Package for the Social Sciences (SPSS) version 18.0, with Microsoft Excel 2007 for the charts; the qualitative data analysis was guided by Discourse Analysis (DA). The results show that, although environmental education and sustainability are recent themes, they are already part of classroom practice, aiming to form citizens conscious of their attitudes towards the environment. We found that further investment in education is still needed so that we can form ever more citizens able to maintain a harmonious relationship between humans and nature, thereby enabling a sustainable environment for present and future generations. Based on the research, we also found that there is still little investment in talks, meetings and events aimed at teachers, which would give them more knowledge to apply appropriately to the needs of the community in which the school is located, always promoting environmental education and sustainability. Such investment would also help provide a better environment for the community by reducing the problems it faces, such as waste, which is not only an environmental issue but also a health issue, since it can transmit various diseases through insects, rodents and other vectors.
Abstract:
Information systems have influenced everyday life at an unexpected pace, with significant changes in the construction sector, whose importance is crucial to the economy of any country. Angola is an emerging economy characterised by an expanding and restructuring market in which information-systems decisions are still taken in isolation; with ever more investors operating in the country and driving growth, it is imperative for the sector to identify and exploit flexible, adaptable information systems to face its competitive forces. Against this background, this dissertation seeks to highlight the importance of Strategic Information Systems Planning (SISP; in Portuguese, PESI) for today's organisations. A SISP study was carried out around a concrete case in an Angolan small-to-medium enterprise, Terponte, SA, with the objective of providing the construction company with an IS plan for the future. Data were collected using qualitative and quantitative methods, the most relevant being document analysis, direct observation, interviews, and questionnaires administered to the bodies of the company under study and other institutions. The data analysis revealed deficient information management and a lack of application integration. This dissertation aims to contribute to the advancement of scientific knowledge in the field of Strategic Information Systems Planning and to the resolution of specific problems, particularly in companies in the civil-construction sector.
Abstract:
Pair Programming is a technique from the software development method eXtreme Programming (XP) whereby two programmers work closely together to develop a piece of software. A similar approach has been used to develop a set of Assessment Learning Objects (ALO). Three members of academic staff have developed a set of ALOs for a total of three different modules (two with overlapping content). In each case a pair programming approach was taken to the development of the ALO. In addition to demonstrating the efficiency of this approach in terms of staff time spent developing the ALOs, a statistical analysis of the outcomes for students who made use of the ALOs is used to demonstrate the effectiveness of the ALOs produced via this method.
Abstract:
A recent area for investigation into the development of adaptable robot control is the use of living neuronal networks to control a mobile robot. The so-called Animat paradigm comprises a neuronal network (the ‘brain’) connected to an external embodiment (in this case a mobile robot), facilitating potentially robust, adaptable robot control and increased understanding of neural processes. Sensory input from the robot is provided to the neuronal network via stimulation on a number of electrodes embedded in a specialist Petri dish (Multi Electrode Array (MEA)); accurate control of this stimulation is vital. We present software tools allowing precise, near real-time control of electrical stimulation on MEAs, with fast switching between electrodes and the application of custom stimulus waveforms. These Linux-based tools are compatible with the widely used MEABench data acquisition system. Benefits include rapid stimulus modulation in response to neuronal activity (closed loop) and batch processing of stimulation protocols.
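The closed-loop behaviour described above, stimulus selection driven by recorded neuronal activity, can be sketched generically. The acquisition and stimulation functions below are hypothetical stand-ins, not the actual MEABench or tool API:

```python
# Generic closed-loop sketch of the kind of control the tools enable: spike
# counts per electrode drive the choice of the next stimulation electrode and
# a custom stimulus waveform. The MEA interface here (read_spike_counts,
# biphasic_pulse) is a hypothetical stand-in, not the actual MEABench API.
import random

N_ELECTRODES = 60                     # common 8x8 MEA layout minus the corners

def read_spike_counts():
    """Stand-in for the acquisition side: spikes per electrode in the last bin."""
    return [random.randint(0, 5) for _ in range(N_ELECTRODES)]

def biphasic_pulse(amplitude_uA=10, phase_us=100):
    """Custom waveform description: a charge-balanced biphasic pulse."""
    return [(+amplitude_uA, phase_us), (-amplitude_uA, phase_us)]

def choose_electrode(counts):
    """Closed-loop rule (illustrative only): stimulate where activity is lowest."""
    return min(range(len(counts)), key=lambda i: counts[i])

log = []
for step in range(5):                 # five closed-loop iterations
    counts = read_spike_counts()
    target = choose_electrode(counts)
    log.append((target, biphasic_pulse()))   # would be sent to the stimulator
```

The fast electrode switching and custom waveforms the abstract highlights are exactly what make the inner loop here feasible in near real time.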
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
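GODIVA2's interoperability rests on open OGC standards, in particular the Web Map Service, so a client can request a rendered view of a four-dimensional dataset with a plain GetMap query. The server address and layer name below are placeholders; the query parameters are the standard WMS 1.3.0 ones, with TIME and ELEVATION carrying the extra dimensions:

```python
# Building a standard OGC WMS 1.3.0 GetMap request of the kind a GODIVA2-style
# server answers. Server URL and layer name are placeholders, not real endpoints.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "OCEAN/sea_water_temperature",   # placeholder layer name
    "CRS": "CRS:84",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "512",
    "HEIGHT": "256",
    "FORMAT": "image/png",
    "TIME": "2010-01-01T00:00:00Z",            # the 4th dimension: time
    "ELEVATION": "-5.0",                       # and depth
}
url = "https://example.org/ncWMS/wms?" + urlencode(params)
```

Because every parameter is defined by the WMS standard, the same request works against any compliant server, which is what allows the mutual comparison of diverse datasets the abstract mentions.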
Abstract:
Technology-enhanced or computer-aided learning (e-learning) can be institutionally integrated and supported by learning management systems or Virtual Learning Environments (VLEs) to offer efficiency gains, effectiveness and scalability of the e-learning paradigm. However, this can only be achieved through the integration of pedagogically intelligent approaches with a lesson-preparation tools environment and a VLE that are well accepted by both students and teachers. This paper critically explores some of the issues relevant to the scalable routinisation of e-learning at the tertiary level, typically first-year university undergraduates, using the teaching of Relational Data Analysis (RDA), as supported by multimedia authoring, as a case study. The paper concludes that blended learning approaches, which balance the deployment of e-learning with other modalities of learning delivery such as instructor-mediated group learning, offer the most flexible and scalable route to e-learning, but that this requires the graceful integration of platforms for multimedia production, distribution and delivery through advanced interactive spaces that provoke learner engagement and promote learning autonomy and group learning, facilitated by a cooperative-creative learning environment that remains open to personal exploration of constructivist-constructionist pathways to learning.
Abstract:
The role of users is an often-overlooked aspect of studies of innovation and diffusion. Using an actor-network theory (ANT) approach, four case studies examine the processes of implementing a piece of CAD (computer aided design) software, BSLink, in different organisations and describe the tailoring done by users to embed the software into working practices. This not only results in different practices of use at different locations, but also transforms BSLink itself into a proliferation of BSLinks-in-use. A focus group for BSLink users further reveals the gaps between different users' expectations and ways of using the software, and between different BSLinks-in-use. It also demonstrates the contradictory demands this places on its further development. The ANT-informed approach used treats both innovation and diffusion as processes of translation within networks. It also emphasises the political nature of innovation and implementation, and the efforts of various actors to delegate manoeuvres for increased influence onto technological artefacts.
Abstract:
Virtual reality has the potential to improve visualisation of building design and construction, but its implementation in the industry has yet to reach maturity. Present-day translation of building data to virtual reality is often unidirectional and unsatisfactory. Three different approaches to the creation of models are identified and described in this paper. Consideration is given to the potential of both advances in computer-aided design and the emerging standards for data exchange to facilitate an integrated use of virtual reality. Commonalities and differences between computer-aided design and virtual reality packages are reviewed, and trials of current systems are described. The trials have been conducted to explore the technical issues related to the integrated use of CAD and virtual environments within the house building sector of the construction industry and to investigate the practical use of the new technology.
Abstract:
Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as (15)N labels the entire element content of the sample, i.e. for (15)N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
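The core complication the abstract notes, that the 15N mass shift depends on amino-acid composition, can be made concrete: counting nitrogen atoms per residue gives the light/heavy spacing for any peptide. A sketch, assuming unmodified peptides and a monoisotopic shift of about 0.997035 Da per nitrogen:

```python
# Why global 15N quantitation is harder than SILAC: the light/heavy mass shift
# depends on how many nitrogen atoms the peptide carries, so it must be computed
# per sequence. Counts below include the one backbone nitrogen per residue and
# assume unmodified peptides.

N_PER_RESIDUE = {
    "G": 1, "A": 1, "S": 1, "P": 1, "V": 1, "T": 1, "C": 1, "L": 1, "I": 1,
    "M": 1, "F": 1, "Y": 1, "D": 1, "E": 1,          # backbone nitrogen only
    "N": 2, "Q": 2, "K": 2, "W": 2,                  # one extra side-chain N
    "H": 3,                                          # imidazole: two extra
    "R": 4,                                          # guanidinium: three extra
}
DELTA_15N = 0.997035   # monoisotopic mass difference 15N - 14N in Da

def n15_mass_shift(peptide):
    """Return (nitrogen count, mass shift in Da) between 14N and 15N forms."""
    n_atoms = sum(N_PER_RESIDUE[aa] for aa in peptide)
    return n_atoms, n_atoms * DELTA_15N

# Same length, very different shifts -- unlike the fixed shift of SILAC pairs:
n1, shift1 = n15_mass_shift("AGVLSTFE")   # no nitrogen-rich side chains
n2, shift2 = n15_mass_shift("RKHQNWGA")   # many side-chain nitrogens
```

This is why 15N pair detection cannot use a single fixed mass offset and instead needs the peptide's (candidate) sequence, which is what makes automating the data-processing step harder than for SILAC.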
Abstract:
The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. The determination of the local structure in terms of the configuration and conformation of the polymer chain and of the packing characteristics of the chain in the bulk material represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret due to the very large amount of information that they carry, that is, the large number of correlations present in the diffraction patterns. We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. The coupling of different computer modelling strategies with neutron scattering data over a wide Q range allows the extraction of detailed quantitative information on the structural arrangements of the materials of interest. Proceeding from modelling routes as diverse as force field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, which illustrate the need to attack the data analysis problem simultaneously from several fronts.
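Of the modelling routes named, reverse Monte Carlo is the most algorithmically self-contained: random structural moves are accepted when they improve agreement with the measured pattern. A deliberately tiny one-dimensional toy, fitting a pair-distance histogram rather than a real S(Q), illustrates the acceptance loop:

```python
# Toy 1-D reverse Monte Carlo in the spirit described: random particle moves
# are accepted when they reduce the chi-squared misfit between a computed
# "pattern" (here a pair-distance histogram) and target data. Real RMC fits
# S(Q) over a wide Q range; this is only a structural sketch of the loop.
import random

random.seed(0)

def histogram(positions, bins=8, box=8.0):
    """Pair-distance histogram for particles on a line of length `box`."""
    counts = [0] * bins
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            d = abs(positions[i] - positions[j]) % box
            counts[min(int(d / box * bins), bins - 1)] += 1
    return counts

def chi2(model, target):
    return sum((m - t) ** 2 for m, t in zip(model, target))

target_positions = [i + 0.1 * random.random() for i in range(8)]  # "experiment"
target = histogram(target_positions)

positions = [random.uniform(0, 8) for _ in range(8)]              # random start
cost = chi2(histogram(positions), target)
start_cost = cost
for step in range(2000):
    i = random.randrange(len(positions))
    old = positions[i]
    positions[i] = (old + random.uniform(-0.5, 0.5)) % 8.0        # trial move
    new_cost = chi2(histogram(positions), target)
    if new_cost <= cost:      # greedy acceptance (real RMC adds a tolerance)
        cost = new_cost
    else:
        positions[i] = old    # reject: restore the previous configuration
```

The "pitfalls" the abstract alludes to show up even here: many different configurations reproduce the same histogram, which is why the authors argue for attacking the problem from several modelling fronts at once.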
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will enable investigation of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current constraints on computing power have severely limited such investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
Abstract:
The UK Department for Environment, Food and Rural Affairs (Defra) identified practices to reduce the risk of animal disease outbreaks. We report on the response of sheep and pig farmers in England to promotion of these practices. A conceptual framework was established from research on factors influencing adoption of animal health practices, linking knowledge, attitudes, social influences and perceived constraints to the implementation of specific practices. Qualitative data were collected from nine sheep and six pig enterprises in 2011. Thematic analysis explored attitudes and responses to the proposed practices, and factors influencing the likelihood of implementation. Most feel they are doing all they can reasonably do to minimise disease risk and that practices not being implemented are either not relevant or ineffective. There is little awareness and concern about risk from unseen threats. Pig farmers place more emphasis than sheep farmers on controlling wildlife, staff and visitor management and staff training. The main factors that influence livestock farmers’ decision on whether or not to implement a specific disease risk measure are: attitudes to, and perceptions of, disease risk; attitudes towards the specific measure and its efficacy; characteristics of the enterprise which they perceive as making a measure impractical; previous experience of a disease or of the measure; and the credibility of information and advice. Great importance is placed on access to authoritative information with most seeing vets as the prime source to interpret generic advice from national bodies in the local context. 
Uptake of disease risk measures could be increased by: improved risk communication through the farming press and vets to encourage farmers to recognise hidden threats; dissemination of credible early warning information to sharpen farmers’ assessment of risk; and targeted information through training events, farming press, vets and other advisers, and farmer groups, tailored to the different categories of livestock farmer.