903 results for Data-driven knowledge acquisition
Abstract:
Competency-based assessment of clinical training processes is fundamental to Physiotherapy. This process links theory with practice, guides the organisation of academic processes, promotes comprehensive education, and establishes permanent, rigorous feedback mechanisms based on evaluating the trainee professional's performance in everyday situations that reflect their professional decision-making. Students are expected to train and act as professionals who are competitive nationally and internationally, in an integral manner, with solid ethical and social commitments, responding to the needs of the social environment in which they work (PEP, Universidad del Rosario). Current research efforts in educational assessment are oriented towards integrating knowledge acquisition with the development of strategies for measuring and quantifying technical-scientific capabilities within each discipline. To date, however, there is no evidence in Colombia of validated instruments for measuring clinical competencies, nor are there standards for assessing clinical practice in the training of human resources in Physiotherapy. In this project, an instrument that measures the clinical competencies of physiotherapy students in clinical practice was developed and its psychometric properties were evaluated. The process involved physiotherapists with teaching and clinical experience who contribute to the training of physiotherapists in Colombia.
Abstract:
Abstract based on that of the publication
Abstract:
Despite the modernisation of technological resources and learning processes, Mathematics in Brazilian public schools remains difficult to teach and to learn; there is a lack of methodological innovation creating the conditions students need to appropriate knowledge. This research on the continuing education of Mathematics teachers in Elementary School Cycle I and innovation in pedagogical practice (music in the teaching of fractions) proposes the use of music as an innovative methodological teaching resource for fractions, with the aim of replacing expository lessons and mechanical exercises with enjoyable, meaningful experiences that form a critical, participative subject. It presents the assessment mechanisms of Brazilian educational policy as well as the nine-year Elementary School programme. It highlights methodological innovation as a necessity in the continuing education of the generalist teacher who is not a mathematics specialist. It develops qualitative research, a case study, and considers the historical process of society and of the subject in order to understand the role of the school, the teacher, and the specificities of the teaching and learning process. The results of this research show the need for higher education institutions to review how they form professionals who question their own teaching practice and are capable of fostering the same attitude in their students. This study contributes to the learning of fractions, avoiding expository lessons and mechanical exercises, through a continuing-education proposal using music as an instrument for teaching fractions, developed by the researcher during the action research process, in addition to promoting debate in the participating schools about innovations in their Pedagogical Projects.
Abstract:
This Policy Contribution assesses the broad obstacles hampering ICT-led growth in Europe and identifies the main areas in which policy could unlock the greatest value. We review estimates of the value that could be generated through take-up of various technologies and carry out a broad matching with policy areas. According to the literature survey and the collected estimates, the areas in which the right policies could unlock the greatest ICT-led growth are product and labour market regulations and the European Single Market. These areas should be reformed to make European markets more flexible and competitive. This would promote wider adoption of modern data-driven organisational and management practices, thereby helping to close the productivity gap between the United States and the European Union. Gains could also be made in the areas of privacy, data security, intellectual property and liability pertaining to the digital economy, especially cloud computing, and next generation network infrastructure investment. Standardisation and spectrum allocation issues are found to be important, though to a lesser degree. Strong complementarities between the analysed technologies suggest, however, that policymakers need to deal with all of the identified obstacles in order to fully realise the potential of ICT to spur long-term growth beyond the partial gains that we report.
Abstract:
The aim of this paper is essentially twofold: first, to describe the use of spherical nonparametric estimators for determining statistical diagnostic fields from ensembles of feature tracks on a global domain, and second, to report the application of these techniques to data derived from a modern general circulation model. New spherical kernel functions are introduced that are more efficiently computed than the traditional exponential kernels. The data-driven techniques of cross-validation, to determine the amount of smoothing objectively, and adaptive smoothing, to vary the smoothing locally, are also considered. Also introduced are techniques for combining seasonal statistical distributions to produce longer-term statistical distributions. Although all calculations are performed globally, only the results for the Northern Hemisphere winter (December, January, February) and Southern Hemisphere winter (June, July, August) cyclonic activity are presented, discussed, and compared with previous studies. Overall, results for the two hemispheric winters are in good agreement with previous studies, both model-based and observational.
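As a sketch of the cross-validation idea above: the snippet below builds a kernel density estimate on the unit sphere and chooses the smoothing parameter by maximising a leave-one-out log-likelihood. The polynomial kernel, the synthetic point cloud and the candidate parameter grid are all illustrative assumptions, not the paper's actual kernels or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(v):
    # Normalise rows to unit vectors on the sphere.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def kernel(cos_d, kappa):
    # Illustrative polynomial spherical kernel in cos(angular distance);
    # kappa acts as an inverse bandwidth (larger = less smoothing).
    return ((1.0 + cos_d) / 2.0) ** kappa

def density(eval_pts, data, kappa):
    # Kernel density estimate on the unit sphere; the factor
    # (kappa + 1) / (4 pi) normalises this kernel over S^2.
    w = kernel(eval_pts @ data.T, kappa)
    return w.mean(axis=1) * (kappa + 1) / (4.0 * np.pi)

def loo_log_likelihood(data, kappa):
    # Leave-one-out cross-validation: score each point under the
    # density built from all the other points.
    w = kernel(data @ data.T, kappa)
    np.fill_diagonal(w, 0.0)
    dens = w.sum(axis=1) / (len(data) - 1) * (kappa + 1) / (4.0 * np.pi)
    return np.log(dens).sum()

# Synthetic "feature track" points clustered near the north pole.
data = unit(rng.normal([0.0, 0.0, 3.0], 1.0, size=(200, 3)))
kappas = [1, 2, 4, 8, 16, 32, 64]
best = max(kappas, key=lambda k: loo_log_likelihood(data, k))
poles = density(np.array([[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]]), data, best)
```

Adaptive smoothing, as mentioned in the abstract, would extend this by letting the smoothing parameter vary with the local density rather than fixing one global value.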
Abstract:
Objective. This study investigated whether trait positive schizotypy or trait dissociation was associated with increased levels of data-driven processing and symptoms of post-traumatic distress following a road traffic accident. Methods. Forty-five survivors of road traffic accidents were recruited from a London Accident and Emergency service. Each completed measures of trait positive schizotypy, trait dissociation, data-driven processing, and post-traumatic stress. Results. Trait positive schizotypy was associated with increased levels of data-driven processing and post-traumatic symptoms during a road traffic accident, whereas trait dissociation was not. Conclusions. Previous results which report a significant relationship between trait dissociation and post-traumatic symptoms may be an artefact of the relationship between trait positive schizotypy and trait dissociation.
Abstract:
Pull pipelining, a pipeline technique in which data is pulled by successor stages from predecessor stages, is proposed. Control circuits using a synchronous, a semi-synchronous and an asynchronous approach are given. Simulation examples for a DLX generic RISC datapath show that common control pipeline circuit overhead is avoided using the proposal. Applications to linear systolic arrays, in cases where computation finishes at early stages in the array, are foreseen. This would allow run-time data-driven digital frequency modulation of synchronous pipelined designs, with applications to implementing algorithms that exhibit average-case processing time using a synchronous approach.
Abstract:
Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection, using empirical mode decomposition (EMD), a data-driven approach removing the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including making a quantitative assessment of its ability to preserve phase content of signals, and proceed to develop a statistical framework with which to assess synchrony episodes. Furthermore, we propose a new approach to ensure signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase locking analysis (EMDPL), with existing methods and show it to offer improved time-frequency localisation of synchrony.
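A full EMD implementation is beyond a short sketch, but the phase-locking step applied between modes can be illustrated. The snippet below is a minimal sketch assuming narrowband inputs (such as the IMFs an EMD routine would return): it computes instantaneous phases via an FFT-based analytic signal and summarises synchrony with a phase-locking value. The paper's statistical assessment framework and EMDPL specifics are not reproduced here.

```python
import numpy as np

def analytic_signal(x):
    # Analytic signal via the discrete Hilbert transform: zero the
    # negative frequencies and double the positive ones in the FFT.
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    # PLV: magnitude of the mean phase-difference phasor. 1 means the
    # instantaneous phases stay locked; 0 means no consistent relation.
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.exp(1j * dphi).mean())

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
rng = np.random.default_rng(1)
# Two 10 Hz sinusoids with a constant phase offset are tightly locked;
# a sinusoid against broadband noise is not.
locked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                             np.sin(2 * np.pi * 10 * t + 0.7))
unlocked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                               rng.standard_normal(1000))
```

In the EMDPL setting, this comparison would be made between matched IMFs of two channels, with the statistical framework deciding which PLV episodes are significant.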
Abstract:
Analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
Abstract:
Hocaoglu MB, Gaffan EA, Ho AK. The Huntington's disease health-related quality of life questionnaire: a disease-specific measure of health-related quality of life. Huntington's disease (HD) is a genetic neurodegenerative disorder characterized by motor, cognitive and psychiatric disturbances, and yet there is no disease-specific patient-reported health-related quality of life outcome measure for patients. Our aim was to develop and validate such an instrument, i.e. the Huntington's Disease health-related Quality of Life questionnaire (HDQoL), to capture the true impact of living with this disease. Semi-structured interviews were conducted with the full spectrum of people living with HD, to form a pool of items, which were then examined in a larger sample prior to data-driven item reduction. We provide the statistical basis for the extraction of three different sets of scales from the HDQoL, and present validation and psychometric data on these scales using a sample of 152 participants living with HD. These new patient-derived scales provide promising patient-reported outcome measures for HD.
Abstract:
Current methods for estimating event-related potentials (ERPs) assume stationarity of the signal. Empirical Mode Decomposition (EMD) is a data-driven decomposition technique that does not assume stationarity. We evaluated an EMD-based method for estimating the ERP. On simulated data, EMD substantially reduced background EEG while retaining the ERP. EMD-denoised single trials also estimated shape, amplitude, and latency of the ERP better than raw single trials. On experimental data, EMD-denoised trials revealed event-related differences between two conditions (condition A and B) more effectively than trials lowpass filtered at 40 Hz. EMD also revealed event-related differences on both condition A and condition B that were clearer and of longer duration than those revealed by low-pass filtering at 40 Hz. Thus, EMD-based denoising is a promising data-driven, nonstationary method for estimating ERPs and should be investigated further.
Abstract:
This contribution introduces a new digital predistorter to compensate for the serious distortions caused by high power amplifiers (HPAs) with memory which exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified based on the B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design the Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inverse can be achieved very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
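The Newton-Raphson inversion step can be illustrated on a stand-in nonlinearity. The saturating AM/AM curve below is a generic Rapp-style characteristic, not the identified B-spline model from the paper, and the derivative is taken numerically where the paper uses De Boor's first-derivative recursion.

```python
import numpy as np

def amp_distortion(r, sat=1.0, p=2.0):
    # Illustrative saturating AM/AM curve (Rapp-style): nearly linear
    # for small r, compressing towards `sat` for large r.
    return r / (1.0 + (r / sat) ** (2 * p)) ** (1.0 / (2 * p))

def d_amp(r, eps=1e-6):
    # Central-difference derivative of the AM/AM curve (stand-in for
    # the analytic B-spline derivative recursion).
    return (amp_distortion(r + eps) - amp_distortion(r - eps)) / (2 * eps)

def invert(y, iters=50):
    # Newton-Raphson: solve g(r) = y for r, giving the predistorter's
    # amplitude mapping that cancels the HPA compression.
    r = np.asarray(y, dtype=float).copy()
    for _ in range(iters):
        r -= (amp_distortion(r) - y) / d_amp(r)
    return r

# Desired output amplitudes (must lie below the saturation level).
y = np.array([0.1, 0.4, 0.7])
r = invert(y)   # drive amplitudes the predistorter must produce
```

Feeding `r` through the amplifier model reproduces the desired `y`; because the curve compresses, the required drive always satisfies `r >= y`.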
Abstract:
There has been an increased emphasis upon the application of science for humanitarian and development planning, decision-making and practice; particularly in the context of understanding, assessing and anticipating risk (e.g. HERR, 2011). However, there remains very little guidance for practitioners on how to integrate sciences they may have had little contact with in the past (e.g. climate). This has led to confusion as to which ‘science’ might be of use and how it would be best utilised. Furthermore, since this integration has stemmed from a need to be more predictive, agencies are struggling with the problems associated with uncertainty and probability. Whilst a range of expertise is required to build resilience, these guidelines focus solely upon the relevant data, information, knowledge, methods, principles and perspective which scientists can provide, that typically lie outside of current humanitarian and development approaches. Using checklists, real-life case studies and scenarios the full guidelines take practitioners through a five step approach to finding, understanding and applying science. This document provides a short summary of the five steps and some key lessons for integrating science.
Abstract:
The induction of classification rules from previously unseen examples is one of the most important data mining tasks in science as well as in commercial applications. In order to reduce the influence of noise in the data, ensemble learners are often applied. However, most ensemble learners are based on decision tree classifiers, which are affected by noise. The Random Prism classifier has recently been proposed as an alternative to the popular Random Forests classifier, which is based on decision trees. Random Prism is based on the Prism family of algorithms, which is more robust to noise. However, like most ensemble classification approaches, Random Prism also does not scale well to large training data. This paper presents a thorough discussion of Random Prism and a recently proposed parallel version of it called Parallel Random Prism. Parallel Random Prism is based on the MapReduce programming paradigm. The paper provides, for the first time, a novel theoretical analysis of the proposed technique and an in-depth experimental study showing that Parallel Random Prism scales well with a large number of training examples, a large number of data features and a large number of processors. The expressiveness of the decision rules that our technique produces makes it a natural choice for Big Data applications where informed decision making increases the user's trust in the system.
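As a miniature sketch of the map/reduce structure described above: each "map" task below trains one ensemble member on a bootstrap sample, and the "reduce" step majority-votes. The one-rule base learner is a deliberate simplification standing in for PrismTCS, and Python's built-in `map` stands in for Hadoop-style mappers (a multiprocessing pool would parallelise it); only the bagging-plus-voting structure mirrors Parallel Random Prism.

```python
import numpy as np

def train_one_rule(X, y):
    # Simplified base learner: the best single-feature threshold rule.
    # (Random Prism's actual base learner is PrismTCS, which induces
    # modular rule sets and also samples random feature subsets.)
    best = (0, 0.0, False, -1.0)          # (feature, threshold, flip, acc)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            pred = X[:, f] > t
            for flip in (False, True):
                acc = ((pred ^ flip) == y).mean()
                if acc > best[3]:
                    best = (f, t, flip, acc)
    return best[:3]

def apply_rule(rule, X):
    f, t, flip = rule
    return (X[:, f] > t) ^ flip

def map_task(seed, X, y):
    # "Map" step: train one ensemble member on a bootstrap sample.
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(y), len(y))
    return train_one_rule(X[idx], y[idx])

def reduce_vote(rules, X):
    # "Reduce" step: majority vote over the ensemble members.
    votes = np.mean([apply_rule(r, X) for r in rules], axis=0)
    return votes > 0.5

rng = np.random.default_rng(42)
X = rng.standard_normal((300, 3))
y = X[:, 0] > 0.0                         # ground truth depends on feature 0
rules = list(map(lambda s: map_task(s, X, y), range(9)))
acc = (reduce_vote(rules, X) == y).mean()
```

Because each map task is independent, distributing them over processors (or MapReduce workers) scales naturally, which is the property the paper analyses.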
Abstract:
Assessments concerning the effects of climate change, water resource availability and water deprivation in West Africa have not frequently considered the positive contribution to be derived from collecting and reusing water for domestic purposes. Where the originating water is taken from a clean water source and has been used the first time for washing or bathing, this water is commonly called "greywater". Greywater is a prolific resource that is generated wherever people live. Treated greywater can be used for domestic cleaning, for flushing toilets where appropriate, for washing cars, sometimes for watering kitchen gardens, and for clothes washing prior to rinsing. Therefore, a large theoretical potential exists to increase total water resource availability if greywater were to be widely reused. Locally treated greywater reduces the distribution network requirement, lowers construction effort and cost and, wherever possible, minimises the associated carbon footprint. Such locally treated greywater offers significant practical opportunities for increasing the total available water resources at a local level. The reuse of treated greywater is one important action that will help to mitigate the declining availability of clean water supplies in some areas, and the mitigation expected to be required in future aligns well with WHO/UNICEF (2012) aspirations. The evaluation of potential opportunities for prioritising greywater systems to support water reuse takes into account the availability of water resources, water use indicators and published estimates in order to understand typical patterns of water demand. The approach supports knowledge acquisition regarding local conditions for enabling capacity building for greywater reuse, the understanding of systems that are most likely to encourage greywater reuse, and practices and future actions to stimulate greywater infrastructure planning, design and implementation.
Although reuse might be considered to increase the uncertainty of achieving a specified quality of the water supply, robust methods and technologies are available for local treatment. Resource strategies for greywater reuse have the potential to consistently improve water efficiency and availability in water-impoverished and water-stressed regions of Ghana and West Africa. In this paper, untreated greywater is referred to as "greywater" and treated greywater as "treated greywater".