773 results for heterogeneous data sources


Relevance: 80.00%

Publisher:

Abstract:

This qualitative case study explored three teacher candidates’ learning and enactment of discourse-focused mathematics teaching practices. Using audio and video recordings of their teaching practice, this study aimed to identify shifts in the way the teacher candidates enacted the following discourse practices: eliciting and using evidence of student thinking, posing purposeful questions, and facilitating meaningful mathematical discourse. The teacher candidates’ written reflections from their practice-based coursework, as well as interviews, were examined to see how two mathematics methods courses influenced their learning and enactment of the three discourse-focused mathematics teaching practices. These data sources were also used to identify tensions the teacher candidates encountered. All three candidates in the study were able to successfully enact and reflect on these discourse-focused mathematics teaching practices at various points in their preparation programs. Consistency of use and areas of improvement differed, however, depending on the tensions experienced by each candidate. Access to quality curriculum materials, as well as time to formulate and enact thoughtful lesson plans that supported classroom discourse, were tensions for these teacher candidates. This study shows that teacher candidates are capable of enacting discourse-focused teaching practices early in their field placements, and that with the support of practice-based coursework they can analyze and reflect on their practice for improvement. It also reveals the importance of assisting teacher candidates in accessing rich mathematical tasks and collaborating during lesson planning. More research is needed to identify how specific aspects of the learning cycle impact individual teachers and how this can be used to improve practice-based teacher education courses.

Relevance: 80.00%

Publisher:

Abstract:

Objectives: Our aim was to study the effect of combination therapy with aspirin and dipyridamole (A+D) over aspirin alone (ASA) in secondary prevention after transient ischemic attack or minor stroke of presumed arterial origin, and to perform subgroup analyses to identify patients who might benefit most from secondary prevention with A+D. Data sources: The previously published meta-analysis of individual patient data was updated with data from ESPRIT (N=2,739); trials without data on the comparison of A+D versus ASA were excluded. Review methods: A meta-analysis was performed using Cox regression, including several subgroup analyses and following baseline risk stratification. Results: A total of 7,612 patients (5 trials) were included in the analyses, 3,800 allocated to A+D and 3,812 to ASA alone. The trial-adjusted hazard ratio for the composite event of vascular death, non-fatal myocardial infarction, and non-fatal stroke was 0.82 (95% confidence interval 0.72-0.92). Hazard ratios did not differ in subgroup analyses based on age, sex, qualifying event, hypertension, diabetes, previous stroke, ischemic heart disease, aspirin dose, type of vessel disease, and dipyridamole formulation, nor across baseline risk strata as assessed with two different risk scores. A+D was also more effective than ASA alone in preventing recurrent stroke, HR 0.78 (95% CI 0.68-0.90). Conclusion: The combination of aspirin and dipyridamole is more effective than aspirin alone for the secondary prevention of stroke and other vascular events in patients with TIA or ischemic stroke of presumed arterial origin. This superiority was found in all subgroups and was independent of baseline risk.
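A pooled hazard ratio like the one above is handled on the log scale, where the standard error of log(HR) can be recovered from a reported 95% CI. The helper functions below are an illustrative sketch, not the meta-analysis code; the only numbers taken from the abstract are the composite-event CI bounds 0.72-0.92:

```python
import math

Z95 = 1.959964  # two-sided 95% normal quantile

def se_from_ci(lo, hi):
    """Recover the SE of log(HR) from a reported 95% CI (lo, hi)."""
    return (math.log(hi) - math.log(lo)) / (2 * Z95)

def ci_from_hr(hr, se):
    """95% CI for a hazard ratio, given the SE of log(HR)."""
    return (math.exp(math.log(hr) - Z95 * se),
            math.exp(math.log(hr) + Z95 * se))

# Reported composite-event CI: 0.72-0.92. The geometric center of the
# interval recovers it exactly; the published point estimate (0.82)
# sits close to that center.
se = se_from_ci(0.72, 0.92)
center = math.sqrt(0.72 * 0.92)
lo, hi = ci_from_hr(center, se)
```

Round-tripping the CI this way is a common sanity check when extracting study results for pooling.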

Relevance: 80.00%

Publisher:

Abstract:

This dissertation focuses on the greenhouse and nursery industry in the United States. Two major issues are explored: irrigation and plant disease. The first two essays examine wireless soil-moisture sensor networks, an emerging technology that measures soil moisture and optimizes irrigation levels in real time. The first essay describes a study in which a nationwide survey of commercial growers was administered to generate estimates of grower demand and willingness to pay for sensor networks. We find that adoption rates for a base system and demand for expansion components are decreasing in price, as expected. The price elasticity of the probability of adoption suggests that sensor networks are likely to diffuse at a rate somewhat greater than that of drip irrigation. In the second essay, yields, time-to-harvest, and plant quality were analyzed to measure sensor network profitability. Sensor-based irrigation was found to increase revenue by 62% and profit by 65% per year. The third essay investigates greenhouse nursery growers’ response to a quarantine imposed on the west coast of the United States from 2002 to present for the plant pathogen that causes Sudden Oak Death. I investigate whether growers choose to 1) improve their sanitation practices, which reduces the underlying risk of disease without increasing the difficulty of detecting the pathogen, 2) increase fungicide use, which also prevents disease but makes existing infections much harder to detect, or 3) change their crop composition towards more resistant species. First, a theoretical model is derived to formalize hypotheses on grower responses to the quarantine, and then these predictions are empirically tested using several public data sources. I do not find evidence that growers improve their sanitation practices in response to the quarantine. 
I do, however, find evidence that growers substantially increase their fungicide use in response to a quarantine policy that requires visual (as opposed to laboratory) inspection for the disease before every crop shipment, suggesting that the quarantine may have the adverse effect of making the pathogen harder to identify. I also find evidence that growers shift away from susceptible crops and toward resistant crops.
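The price-elasticity finding in the first essay can be illustrated with a simple logit adoption model. The coefficients and price below are hypothetical placeholders, not the survey estimates:

```python
import math

def adoption_prob(price, alpha, beta):
    """Logit probability of adopting a sensor-network base system.
    alpha and beta are hypothetical coefficients (beta < 0 means
    adoption falls as price rises)."""
    return 1.0 / (1.0 + math.exp(-(alpha + beta * price)))

def price_elasticity(price, alpha, beta):
    """Elasticity of the adoption probability with respect to price.
    For a logit model dP/dprice = beta * P * (1 - P), so the
    elasticity (dP/dprice * price / P) is beta * (1 - P) * price."""
    p = adoption_prob(price, alpha, beta)
    return beta * (1.0 - p) * price

# Illustrative values only: at this price the adoption probability
# is price-elastic (elasticity below -1).
e = price_elasticity(1000.0, alpha=1.0, beta=-0.002)
```

An elasticity below -1 in magnitude is the kind of result the essay interprets when comparing diffusion speed with that of drip irrigation.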

Relevance: 80.00%

Publisher:

Abstract:

Objective: Cost-effectiveness analysis of a 6-month course of apixaban (10 mg/12h for the first 7 days; 5 mg/12h afterwards) for the treatment of a first venous thromboembolism (VTE) event and prevention of recurrences, versus treatment with low-molecular-weight heparins/vitamin K antagonists (LMWH/VKA). Material and methods: A lifetime Markov model with 13 health states was used to describe the course of the disease. Efficacy and safety data were obtained from the AMPLIFY and AMPLIFY-EXT clinical trials; health outcomes were measured as life years gained (LYG) and quality-adjusted life years (QALYs). The analysis was conducted from the perspective of the Spanish National Health System (NHS). Costs of drugs, VTE management, and complications were obtained from several Spanish data sources (€, 2014). A 3% discount rate was applied to health outcomes and costs. Univariate and probabilistic sensitivity analyses (SA) were performed to assess the robustness of the results. Results: Apixaban was the most effective therapy, with 7.182 LYG and 5.865 QALYs, versus 7.160 LYG and 5.838 QALYs with LMWH/VKA. Furthermore, apixaban had a lower total cost (€13,374.70 vs €13,738.30). Probabilistic SA confirmed dominance of apixaban (better health outcomes at lower cost) in 89% of the simulations. Conclusions: From the NHS perspective, apixaban 5 mg/12h was an efficient therapeutic strategy versus LMWH/VKA for the treatment of VTE and the prevention of recurrences.
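The mechanics of a cohort Markov model with discounting can be sketched in miniature. The states, transition probabilities, utilities, and costs below are invented placeholders (the published model has 13 health states and AMPLIFY-based inputs), but the cycling and 3% discounting work the same way:

```python
def run_markov(P, utility, cost, cycles=40, disc=0.03):
    """Cycle a cohort through a Markov model with 1-year cycles,
    accumulating QALYs and costs discounted at rate `disc`."""
    dist = {"stable": 1.0, "recurrent_vte": 0.0, "dead": 0.0}
    qalys = costs = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + disc) ** t  # discount factor for cycle t
        qalys += df * sum(dist[s] * utility[s] for s in dist)
        costs += df * sum(dist[s] * cost[s] for s in dist)
        nxt = {s: 0.0 for s in dist}  # redistribute the cohort
        for s, share in dist.items():
            for s2, pr in P[s].items():
                nxt[s2] += share * pr
        dist = nxt
    return qalys, costs

# Hypothetical inputs -- illustrative only, not the AMPLIFY-based model.
P = {"stable":        {"stable": 0.93, "recurrent_vte": 0.04, "dead": 0.03},
     "recurrent_vte": {"stable": 0.20, "recurrent_vte": 0.70, "dead": 0.10},
     "dead":          {"dead": 1.0}}
utility = {"stable": 0.85, "recurrent_vte": 0.70, "dead": 0.0}
cost = {"stable": 500.0, "recurrent_vte": 3000.0, "dead": 0.0}

qalys, total_cost = run_markov(P, utility, cost)
```

Running the same model under two treatment arms (different transition probabilities and drug costs) and comparing discounted QALYs and costs is what establishes dominance in analyses like this one.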

Relevance: 80.00%

Publisher:

Abstract:

Data without labels are commonly analyzed with unsupervised machine learning techniques. Such techniques provide more meaningful representations, useful for a better understanding of the problem at hand, than looking at the data alone. Although abundant expert knowledge exists in many areas where unlabelled data are examined, such knowledge is rarely incorporated into automatic analysis. Incorporating expert knowledge is frequently a matter of combining multiple data sources from disparate hypothetical spaces. When such spaces belong to different data types, the task becomes even more challenging. In this paper we present a novel immune-inspired method that enables the fusion of such disparate types of data for a specific set of problems. We show that our method provides a better visual understanding of one hypothetical space with the help of data from another hypothetical space. We believe that our model has implications for the field of exploratory data analysis and knowledge discovery.

Relevance: 80.00%

Publisher:

Abstract:

Doctoral thesis, Universidade de Brasília, Instituto de Ciências Sociais, Departamento de Sociologia, 2014.

Relevance: 80.00%

Publisher:

Abstract:

Master’s dissertation in Didactics of Modern Languages and Cultures, English specialization, Faculdade de Ciências Humanas e Sociais, Univ. do Algarve, 2003.

Relevance: 80.00%

Publisher:

Abstract:

Constitutional Amendment 64/2010 guaranteed the Human Right to Food as a basic social right, amending Article 6 of the Brazilian Federal Constitution. This article analyzes the significant implications of this change for the management of Brazilian public policies. By securing the Right to Food as a basic social right, the Constitution established a duty, a positive obligation, of the Brazilian State. The article also discusses the meaning of this change for the Brazilian information system, arguing that data sources and indicator systems built for the consistent monitoring of food and nutrition (in)security in the country already exist; it now falls to the federal government and to the managers of the Brazilian system of statistical and geographic information to define the regularity and frequency with which these instruments are applied and their results published. Our attention focuses mainly on the possible uses of the Pesquisa de Orçamentos Familiares (household budget survey) and the Pesquisa Nacional por Amostra Domiciliar (national household sample survey) as data sources.

Relevance: 80.00%

Publisher:

Abstract:

Libraries, since their inception 4,000 years ago, have been in a process of constant change. Although change was slow for centuries, in recent decades academic libraries have continuously striven to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource-allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is conducted. First, a holistic structure and the toolset required to assess academic libraries holistically are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used in which the library system and collection are analyzed from the perspectives of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system: the library’s performance, and the costs incurred and resources consumed by library services, are analyzed. The second quadrant evaluates the external perspective of the library system: users’ perception of service quality is judged in this quadrant. The third quadrant analyzes the external perspective of the library collection: the impact of the current library collection on its users is evaluated.
Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns of the library collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in a scheme adequate for decision support. A data warehousing approach is then designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and applied for several purposes, including the following: 1) Data visualization and reporting allow library managers to publish library indicators simply and quickly using online reporting tools. 2) More sophisticated analysis is carried out with data mining tools; three data mining techniques are examined in this research study: regression, clustering, and classification. These techniques are applied to the case study as follows: predicting future investment in library development; finding clusters of users who share common interests and similar profiles but belong to different faculties; and identifying library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) As input for optimization models, early experience in developing an optimal resource-allocation model to distribute resources among the different processes of a library system is documented in this study. Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed. An optimization model is defined with the objective of maximizing the usage of the digital collection over all library divisions subject to a single collection budget.
By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution to assist library managers to make economic decisions based on an “as realistic as possible” perspective of the library situation.
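The fund-allocation problem described above is knapsack-style: maximize expected usage of the digital collection subject to a single budget. A greedy usage-per-euro heuristic sketches the idea; the division names, costs, and usage figures are invented, and the dissertation formulates a proper optimization model rather than this shortcut:

```python
def allocate(items, budget):
    """Greedy 0/1 knapsack heuristic: fund the subscriptions with the
    best expected-usage-per-euro ratio until the budget runs out.
    `items` holds (name, cost, expected_usage) triples."""
    chosen, spent, usage = [], 0.0, 0.0
    for name, cost, use in sorted(items, key=lambda it: it[2] / it[1],
                                  reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
            usage += use
    return chosen, spent, usage

# Invented example: three divisions competing for a 150-unit budget.
items = [("engineering", 100.0, 500.0),
         ("humanities", 50.0, 400.0),
         ("law", 80.0, 200.0)]
chosen, spent, usage = allocate(items, budget=150.0)
```

Greedy selection is not guaranteed optimal for 0/1 knapsack; an exact model would be solved with integer programming, as the optimization-model section implies.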

Relevance: 80.00%

Publisher:

Abstract:

OBJECTIVE: Traditionally, upper airway examination is performed while the patient is awake. However, in the past two decades, drug-induced sleep endoscopy (DISE) has been used as a method of three-dimensional evaluation of the upper airway during pharmacologically induced sleep. This study aimed to systematically review the evidence regarding the usefulness of DISE, compared with traditional awake examination, for surgical decision making in patients with obstructive sleep apnea (OSA). DATA SOURCES: The Scopus, PubMed, and Cochrane Library databases were searched. REVIEW METHODS: Only studies whose primary objective was to evaluate the usefulness of DISE for surgical decision making in patients with OSA were selected. The included studies directly compared awake examination data with DISE outcome data in terms of possible influences on surgical decision making and operation success. RESULTS: A total of eight studies with 535 patients were included in this review. Overall, the surgical treatment changed after DISE in 50.24% (standard deviation 8.4) of cases. These changes were most frequently associated with structures contributing to hypopharyngeal or laryngeal obstruction. However, these differences do not automatically indicate a higher success rate. CONCLUSION: This review emphasized the direct impact of DISE, compared with awake examination, on surgical decision making in OSA patients. However, it is also clear that the available published studies lack evidence on the association between this impact and surgical outcomes.

Relevance: 80.00%

Publisher:

Abstract:

Context: The efficacy of preoperative chemotherapy in Wilms’ tumor patients remains controversial. Objectives: The objective of this meta-analysis is to assess the efficacy of preoperative chemotherapy in Wilms’ tumor patients and to explore its value for specific subgroups. Data Sources: A computer-based systematic search with “preoperative chemotherapy”, “Neoadjuvant Therapy” and “Wilms’ tumor” as search terms was performed through January 2013. Study Selection: No language restrictions were applied. Searches were limited to randomized clinical trials (RCTs) or retrospective studies in human participants under 18 years. A manual examination of references in selected articles was also performed. Data Extraction: Relative risks (RR) with their 95% confidence intervals (CI) for tumor shrinkage (TS), total tumor resection (TR), and event-free survival (EFS), together with details of the subgroup analyses, were extracted. The meta-analysis was carried out with the software STATA 11.0. In total, four original RCTs and 28 retrospective studies with 2,375 patients were included. Results: For preoperative chemotherapy vs. up-front surgery (PC vs. SU), the pooled RR was 9.109 for TS (95% CI: 5.109-16.241; P < 0.001), 1.291 for TR (95% CI: 1.124-1.483; P < 0.001), and 1.101 for EFS (95% CI: 0.980-1.238; P = 0.106). For the subgroup short course vs. long course (SC vs. LC), the pooled RR was 1.097 for TS (95% CI: 0.784-1.563; P = 0.587), 1.197 for TR (95% CI: 0.960-1.493; P = 0.110), and 1.006 for EFS (95% CI: 0.910-1.250; P = 0.430).
Conclusions: Short-course preoperative chemotherapy is as effective as long-course, and preoperative chemotherapy benefits Wilms’ tumor patients only in tumor shrinkage and resection, not in event-free survival.
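Pooling relative risks across studies, as in a meta-analysis like this one, is done on the log scale with inverse-variance weights. A minimal fixed-effect sketch with invented study results (not the 32 included studies):

```python
import math

Z95 = 1.959964  # two-sided 95% normal quantile

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooling of relative risks.
    `studies` holds (rr, ci_lo, ci_hi) triples; each study is weighted
    by 1/SE^2, with SE recovered from its log-scale 95% CI."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * Z95)
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    log_rr, se_pooled = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - Z95 * se_pooled),
            math.exp(log_rr + Z95 * se_pooled))

# Invented inputs: two identical studies pool to the same point
# estimate with a narrower confidence interval.
rr, lo, hi = pooled_rr([(2.0, 1.5, 2.6), (2.0, 1.5, 2.6)])
```

A random-effects model would add a between-study variance term to the weights; the fixed-effect version above is the simplest illustration of the pooling step.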

Relevance: 80.00%

Publisher:

Abstract:

Context: Blood pressure (BP) tracks from childhood to adulthood and shows ethnic variation, so it is important to assess pediatric BP in different populations. This study aims to systematically review the studies conducted on BP in Iranian children and adolescents. Evidence Acquisition: We conducted a systematic review of published and national data on pediatric BP in Iran. Our search covered PubMed, Medline, ISI, and Scopus, as well as national databases including the Scientific Information Database (SID), IranMedex, and Irandoc, from 1990 to 2014. Results: We found 1,373 records in the primary search, 840 from international and 533 from national databases. After the selection and quality-assessment phases, data were extracted from 36 papers and four national data sources. Mean systolic BP (SBP) varied from 90.1 ± 14 mmHg (95% CI 89.25, 90.94) to 120.2 ± 12.3 (118.98, 121.41) mmHg, and mean diastolic BP (DBP) from 50.7 ± 11.4 (50.01, 51.38) to 79.2 ± 12.3 (77.95, 80.44) mmHg. The frequency of elevated BP varied widely across sub-national studies, with rates as low as 0.4% (0.009, 1.98) for high SBP and as high as 24.1% (20.8, 27.67) for high DBP. At the national level, three surveys reported slightly raised rates of elevated BP from 2009 to 2012. Conclusions: The findings provide practical information on BP levels in the Iranian pediatric population. Although the findings of the various studies differ, this review underscores the necessity of tracking BP from childhood and of implementing interventions for the primordial prevention of hypertension.

Relevance: 80.00%

Publisher:

Abstract:

Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on, and threats to, forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample-plot measurements to characterize an unknown population across a larger area of interest. However, field-plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample-plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach that couples remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for part of the Lake States and species-specific site index maps for the entire Lake States. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing-volume map in a cost-effective way.
The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
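The imputation step of an approach like RF-kNN can be sketched in simplified form: predict a forest attribute for an unsampled pixel as the mean over its k most similar reference plots. Here plain Euclidean distance in predictor space stands in for the Random Forest proximity measure, so this illustrates generic kNN imputation, not the RF-kNN method itself; the plot data are invented:

```python
import math

def knn_impute(target_x, reference, k=3):
    """Impute a plot attribute (e.g. biomass) for a pixel whose
    remote-sensing predictor vector is `target_x`, as the mean of the
    attribute over the k nearest reference plots.
    `reference` holds (predictor_vector, attribute) pairs."""
    dists = sorted((math.dist(target_x, x), y) for x, y in reference)
    return sum(y for _, y in dists[:k]) / k

# Invented reference plots: (spectral predictors, biomass).
plots = [((0.0, 0.0), 10.0), ((0.0, 1.0), 12.0),
         ((5.0, 5.0), 100.0), ((6.0, 6.0), 110.0)]
biomass = knn_impute((0.0, 0.5), plots, k=2)
```

In RF-kNN the distance would instead come from how often two observations land in the same terminal nodes of a fitted Random Forest, which adapts the neighborhood to the response variable.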

Relevance: 80.00%

Publisher:

Abstract:

Thesis (Ph.D., Education) -- Queen's University, 2016-09-22.

Relevance: 80.00%

Publisher:

Abstract:

Data sources are often dispersed geographically in real-life applications. Finding a knowledge model may require joining all the data sources and running a machine learning algorithm on the joint set. We present an alternative based on a Multi-Agent System (MAS): an agent mines one data source to extract a local theory (knowledge model) and then merges it with the previous MAS theory using a knowledge fusion technique. In this way, we obtain a global theory that summarizes the distributed knowledge without spending the resources and time required to join the data sources. New experiments have been executed, including statistical significance analysis. The results show that, as a result of knowledge fusion, the accuracy of the initial theories is significantly improved, and the fused theory also improves on the accuracy of the monolithic solution.
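The merge step can be sketched with a deliberately simple "theory": per-class feature centroids that an agent learns from its local source, plus a fusion rule that combines two theories by count-weighted averaging, so no raw data ever has to leave its source. This is an invented stand-in for the paper's knowledge-fusion technique, chosen because the fused result provably equals what a monolithic learner would get on the joined data:

```python
def learn_local(data):
    """An agent's local theory: per-class feature centroids with the
    sample count behind each centroid. `data` holds (x, label) pairs."""
    sums, counts = {}, {}
    for x, label in data:
        counts[label] = counts.get(label, 0) + 1
        s = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
    return {lab: ([v / counts[lab] for v in s], counts[lab])
            for lab, s in sums.items()}

def fuse(theory_a, theory_b):
    """Merge two local theories without revisiting any raw data:
    centroids are combined as count-weighted averages."""
    merged = dict(theory_a)
    for lab, (cent_b, n_b) in theory_b.items():
        if lab in merged:
            cent_a, n_a = merged[lab]
            n = n_a + n_b
            merged[lab] = ([(ca * n_a + cb * n_b) / n
                            for ca, cb in zip(cent_a, cent_b)], n)
        else:
            merged[lab] = (cent_b, n_b)
    return merged

def predict(theory, x):
    """Classify by nearest class centroid (squared Euclidean distance)."""
    return min(theory,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(theory[lab][0], x)))

# Invented local sources: each agent sees only its own data.
agent1 = learn_local([((0.0, 0.0), "a"), ((0.0, 2.0), "a")])
agent2 = learn_local([((4.0, 0.0), "a"), ((10.0, 10.0), "b")])
global_theory = fuse(agent1, agent2)
```

Because the fusion is exact for centroids, the global theory here matches the monolithic one; for richer theories (rule sets, trees), fusion is approximate, which is why the paper evaluates accuracy empirically.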