872 results for heterogeneous data sources
Abstract:
The research aims to answer a fundamental question: which of the disability models currently in use is optimal for creating “accessible tourism-oriented” amenities? It also addresses more detailed problems: (1) what is disability and what determines the different disability models? (2) what types of tourism market supply available to the disabled do the different disability models suggest? (3) are the disability models complementary or mutually exclusive? (4) is the idea of social integration and inclusion of people with disabilities (PWD) on tourist trips supported by society? Data sources comprise selected literature and the results of a survey conducted face-to-face and via the SurveyMonkey website from May 2013 to July 2014. The surveyed group included 619 people (82% were Polish; the other 18% were foreigners from Russia, Germany, Portugal, Slovakia, Canada, Tunisia and the United Kingdom). The research showed that the different disability models – medical, social, geographical and economic – are useful when creating tourism supply for PWD. Using the research results, the authors suggest a model of “diversification of the structure of tourism market supply available to the disabled”, which includes different types of supply, from specialist to universal. This model has practical applications and can help entrepreneurs segment tourism offers addressed to PWD. The work is innovative both in its theoretical approach (the review of disability models and their practical application in creating tourism supply) and in its empirical value: it provides current data on social attitudes towards the development of PWD tourism. The presentation of a wide range of perceptions of disability, together with a simple classification of tourism supply that meets the varied needs of PWD, is a particular novelty of this chapter.
Abstract:
Planners require solutions that address routine work needs, which is essential to improving efficiency and productivity. A great number of factors relate to beekeeping activity as well as to the quality and productivity of different bee products. Spatial analysis is a powerful tool for overlaying and relating various levels of information on a map, and is consequently very useful for planning beekeeping activity. This work proposes and applies a methodology for assessing beekeeping potential in Montesinho Natural Park, a region in the northeast of Portugal. The beekeeping potential maps were developed with the following data sources: legal standards, vegetation, land use, topography, water resources, roads, electromagnetic fields, and honey physico-chemical analyses. The design and implementation of a spatial analysis model based on a Geographic Information System (GIS) for beekeeping planning activities has already been described by Anjos et al. (2014). Spatial analysis techniques allow defining the beekeeping potential map, supporting beekeeping management in this region. Anjos O, Silva G, Roque N, Fernandez P (2014). GIS based analysis to support the beekeeping planning. Book of abstracts of the International Symposium on Bee Products, 3rd edition – Annual meeting of the International Honey Commission (IHC), Faculty of Medicine, University of Rijeka, p. 61.
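The weighted-overlay logic behind such potential maps can be sketched with raster arrays. A minimal sketch follows; the layers, suitability scores, weights, and the legal-constraint mask are illustrative assumptions, not values from the study:

```python
import numpy as np

# Illustrative 3x3 grids of suitability scores (0-1) per layer;
# real layers would be rasterized from GIS data (vegetation, water, roads, ...).
vegetation = np.array([[0.9, 0.7, 0.2], [0.8, 0.6, 0.3], [0.5, 0.4, 0.1]])
water      = np.array([[0.6, 0.9, 0.4], [0.7, 0.8, 0.2], [0.3, 0.5, 0.1]])
roads      = np.array([[0.5, 0.6, 0.9], [0.4, 0.7, 0.8], [0.2, 0.3, 0.6]])

# Hypothetical weights expressing each layer's relative importance.
weights = {"vegetation": 0.5, "water": 0.3, "roads": 0.2}

# Weighted overlay: per-cell weighted sum of the layer scores.
potential = (weights["vegetation"] * vegetation
             + weights["water"] * water
             + weights["roads"] * roads)

# Cells that violate a hard legal standard (e.g. proximity to power lines)
# are masked out entirely rather than down-weighted.
legal_mask = np.array([[1, 1, 1], [1, 1, 0], [1, 1, 1]])
potential = potential * legal_mask
```

The resulting grid ranks candidate apiary locations; hard constraints are applied as a binary mask so that a high weighted score cannot compensate for a legal exclusion.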
Abstract:
In this paper, we implement an anomaly detection system using the Dempster-Shafer method. Using two standard benchmark problems, we show that combining multiple signals achieves better results than using a single signal. We further show that, applied to a real-world email dataset, the algorithm works for email worm detection. Dempster-Shafer is thus a promising method for anomaly detection problems with multiple features (data sources) and two or more classes.
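Dempster's rule of combination, which underlies this kind of multi-signal fusion, can be sketched as follows; the two-class frame of discernment and the mass values are illustrative, not taken from the paper:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets over the frame of discernment."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to contradictory evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Normalize by 1 - K, redistributing the conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Frame of discernment: {normal, anomalous}. Each signal may also assign
# mass to the whole frame, expressing ignorance.
N, A = frozenset({"normal"}), frozenset({"anomalous"})
theta = N | A
signal1 = {A: 0.6, N: 0.1, theta: 0.3}
signal2 = {A: 0.5, N: 0.2, theta: 0.3}
fused = combine(signal1, signal2)
```

Here two signals that each weakly favor "anomalous" reinforce one another after combination, which is the effect the abstract reports when fusing multiple features.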
Abstract:
Life Cycle Climate Performance (LCCP) is an evaluation method by which heating, ventilation, air conditioning and refrigeration systems can be assessed for their global warming impact over the course of their complete life cycle. LCCP is more inclusive than previous metrics such as Total Equivalent Warming Impact. It is calculated as the sum of direct and indirect emissions generated over the lifetime of the system “from cradle to grave”. Direct emissions include all effects from the release of refrigerants into the atmosphere during the lifetime of the system, including annual leakage and losses during disposal of the unit. Indirect emissions include emissions from energy consumption during the manufacturing process, lifetime operation, and disposal of the system. This thesis proposes a standardized approach to the use of LCCP and traceable data sources for all aspects of the calculation. An equation is proposed that unifies the efforts of previous researchers, and data sources are recommended for average values of all LCCP inputs. A residential heat pump sample problem illustrating the methodology is presented, in which the heat pump is evaluated at five U.S. locations in different climate zones. An Excel tool was developed for residential heat pumps using the proposed method. The primary factor in the LCCP calculation is the energy consumption of the system. The effects of advanced vapor compression cycles, which attempt to reduce energy consumption in various ways, are then investigated for heat pump applications. There are three categories of advanced cycle options: subcooling cycles, expansion loss recovery cycles and multi-stage cycles. The cycles selected for research are the suction line heat exchanger cycle, the expander cycle, the ejector cycle, and the vapor injection cycle. The cycles are modeled using Engineering Equation Solver and the results are applied to the LCCP methodology.
The expander cycle, ejector cycle and vapor injection cycle are effective in reducing the LCCP of a residential heat pump by 5.6%, 8.2% and 10.5%, respectively, in Phoenix, AZ. When evaluated with low-GWP refrigerants, the advanced cycles are capable of reducing the LCCP of a residential heat pump by 13.7%, 16.3% and 18.6% using a refrigerant with a GWP of 10. To meet the U.S. Department of Energy's goal of reducing residential energy use by 40% by 2025, with a proportional reduction in all other categories of residential energy consumption, a reduction of 34.8% in the energy consumption of a residential heat pump with a refrigerant GWP of 10 is necessary for Phoenix, AZ. A combination of advanced cycles, control options and low-GWP refrigerants is necessary to meet this goal.
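The LCCP structure described above (direct refrigerant emissions plus indirect emissions from manufacturing, lifetime operation, and disposal) can be sketched as a simple calculation. All input values below are illustrative assumptions, not the thesis's recommended averages:

```python
def lccp(gwp, charge_kg, annual_leak_rate, eol_loss_rate, lifetime_yr,
         annual_energy_kwh, grid_co2_per_kwh, embodied_co2, disposal_co2):
    """LCCP in kg CO2-eq: direct refrigerant releases over the lifetime
    plus indirect emissions from manufacturing, operation, and disposal."""
    # Direct: annual leakage over the lifetime plus end-of-life loss,
    # converted to CO2-eq via the refrigerant's GWP.
    direct = gwp * charge_kg * (annual_leak_rate * lifetime_yr + eol_loss_rate)
    # Indirect: lifetime electricity use times the grid emission factor,
    # plus embodied (manufacturing) and disposal emissions.
    indirect = (annual_energy_kwh * lifetime_yr * grid_co2_per_kwh
                + embodied_co2 + disposal_co2)
    return direct + indirect

# Illustrative residential heat pump inputs (hypothetical values):
total = lccp(gwp=2088, charge_kg=3.0, annual_leak_rate=0.04,
             eol_loss_rate=0.15, lifetime_yr=15,
             annual_energy_kwh=4000, grid_co2_per_kwh=0.5,
             embodied_co2=800, disposal_co2=100)
```

Even with these rough numbers the operating-energy term dominates the total, consistent with the thesis's observation that energy consumption is the primary factor in the LCCP calculation.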
Abstract:
The goal was to understand, document and model how information currently flows internally in the largest dairy organization in Finland. The organization has undergone radical changes in the past years due to economic sanctions between the European Union and Russia. The organization's ultimate goal is therefore to continue its growth by managing its sales process more efficiently. The thesis consists of a literature review and an empirical part. The literature review covers knowledge management and process modeling theories. First, the knowledge management part discusses how data, information and knowledge are exchanged in the process; knowledge management models and processes describe how knowledge is created, exchanged and managed in an organization. Secondly, the process modeling part visualizes information flow through a discussion of modeling approaches and a presentation of different methods and techniques. Finally, the process documentation procedure is presented. A constructive research approach was used to identify process-related problems and bottlenecks, and possible solutions were presented based on this approach. The empirical part of the study is based on 37 interviews, the organization's internal data sources and the theoretical framework. The acquired data and information were used to document and model the sales process in question with a flowchart diagram. Results are derived from the construction of the flowchart diagram and analysis of the documentation; answers to the research questions draw on both the empirical and theoretical parts. In the end, 14 problems and two bottlenecks were identified in the process. The most important problems relate to the lack of a standardized approach to information sharing, insufficient utilization of information technology tools, and a lack of systematic documentation. The bottlenecks are caused by the alarming number of changes made to files after their deadlines.
Abstract:
This qualitative case study explored three teacher candidates’ learning and enactment of discourse-focused mathematics teaching practices. Using audio and video recordings of their teaching practice, this study aimed to identify shifts in the way the teacher candidates enacted the following discourse practices: eliciting and using evidence of student thinking, posing purposeful questions, and facilitating meaningful mathematical discourse. The teacher candidates’ written reflections from their practice-based coursework, as well as interviews, were examined to see how two mathematics methods courses influenced their learning and enactment of the three discourse-focused mathematics teaching practices. These data sources were also used to identify tensions the teacher candidates encountered. All three candidates in the study were able to successfully enact and reflect on these discourse-focused mathematics teaching practices at various time points in their preparation programs. Consistency of use and areas of improvement differed, however, depending on the tensions experienced by each candidate. Access to quality curriculum materials, as well as time to formulate and enact thoughtful lesson plans that supported classroom discourse, were tensions for these teacher candidates. This study shows that teacher candidates are capable of enacting discourse-focused teaching practices early in their field placements, and that with the support of practice-based coursework they can analyze and reflect on their practice for improvement. This study also reveals the importance of assisting teacher candidates in accessing rich mathematical tasks and collaborating during lesson planning. More research is needed to identify how specific aspects of the learning cycle impact individual teachers and how this can be used to improve practice-based teacher education courses.
Abstract:
Objectives: Our aim was to study the effect of combination therapy with aspirin and dipyridamole (A+D) over aspirin alone (ASA) in secondary prevention after transient ischemic attack or minor stroke of presumed arterial origin, and to perform subgroup analyses to identify patients who might benefit most from secondary prevention with A+D. Data sources: The previously published meta-analysis of individual patient data was updated with data from ESPRIT (N=2,739); trials without data on the comparison of A+D versus ASA were excluded. Review methods: A meta-analysis was performed using Cox regression, including several subgroup analyses and following baseline risk stratification. Results: A total of 7,612 patients (5 trials) were included in the analyses, 3,800 allocated to A+D and 3,812 to ASA alone. The trial-adjusted hazard ratio for the composite event of vascular death, non-fatal myocardial infarction and non-fatal stroke was 0.82 (95% confidence interval 0.72-0.92). Hazard ratios did not differ in subgroup analyses based on age, sex, qualifying event, hypertension, diabetes, previous stroke, ischemic heart disease, aspirin dose, type of vessel disease and dipyridamole formulation, nor across baseline risk strata as assessed with two different risk scores. A+D was also more effective than ASA alone in preventing recurrent stroke, HR 0.78 (95% CI 0.68-0.90). Conclusion: The combination of aspirin and dipyridamole is more effective than aspirin alone in patients with TIA or ischemic stroke of presumed arterial origin in the secondary prevention of stroke and other vascular events. This superiority was found in all subgroups and was independent of baseline risk.
Abstract:
This dissertation focuses on the greenhouse and nursery industry in the United States. Two major issues are explored: irrigation and plant disease. The first two essays examine wireless soil-moisture sensor networks, an emerging technology that measures soil moisture and optimizes irrigation levels in real time. The first essay describes a study in which a nationwide survey of commercial growers was administered to generate estimates of grower demand and willingness to pay for sensor networks. We find that adoption rates for a base system and demand for expansion components are decreasing in price, as expected. The price elasticity of the probability of adoption suggests that sensor networks are likely to diffuse at a rate somewhat greater than that of drip irrigation. In the second essay, yields, time-to-harvest, and plant quality were analyzed to measure sensor network profitability. Sensor-based irrigation was found to increase revenue by 62% and profit by 65% per year. The third essay investigates greenhouse nursery growers’ response to a quarantine imposed on the west coast of the United States from 2002 to present for the plant pathogen that causes Sudden Oak Death. I investigate whether growers choose to 1) improve their sanitation practices, which reduces the underlying risk of disease without increasing the difficulty of detecting the pathogen, 2) increase fungicide use, which also prevents disease but makes existing infections much harder to detect, or 3) change their crop composition towards more resistant species. First, a theoretical model is derived to formalize hypotheses on grower responses to the quarantine, and then these predictions are empirically tested using several public data sources. I do not find evidence that growers improve their sanitation practices in response to the quarantine. 
I do, however, find evidence that growers heavily increase their fungicide use in response to a quarantine policy that requires visual (as opposed to laboratory) inspection for the disease before every crop shipment, suggesting that the quarantine may have the adverse effect of making the pathogen harder to identify. I also find evidence that growers shift away from susceptible crops and towards resistant crops.
Abstract:
Objective: Cost-effectiveness analysis of a 6-month treatment of apixaban (10 mg/12h for the first 7 days; 5 mg/12h afterwards) for the treatment of a first event of venous thromboembolism (VTE) and prevention of recurrences, versus treatment with low-molecular-weight heparins/vitamin K antagonists (LMWH/VKA). Material and methods: A lifetime Markov model with 13 health states was used to describe the course of the disease. Efficacy and safety data were obtained from the AMPLIFY and AMPLIFY-EXT clinical trials; health outcomes were measured as life years gained (LYG) and quality-adjusted life years (QALYs). The perspective chosen for this analysis was that of the Spanish National Health System (NHS). Costs of drugs, VTE management and complications were obtained from several Spanish data sources (, 2014). A 3% discount rate was applied to health outcomes and costs. Univariate and probabilistic sensitivity analyses (SA) were performed in order to assess the robustness of the results. Results: Apixaban was the most effective therapy, with 7.182 LYG and 5.865 QALYs, versus 7.160 LYG and 5.838 QALYs with LMWH/VKA. Furthermore, apixaban had a lower total cost (13,374.70 vs 13,738.30). Probabilistic SA confirmed the dominance of apixaban (better health outcomes with lower associated costs) in 89% of the simulations. Conclusions: Apixaban 5 mg/12h versus LMWH/VKA was an efficient therapeutic strategy for the treatment and prevention of recurrences of VTE from the NHS perspective.
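The dominance result can be reproduced from the figures reported in the abstract with a minimal incremental-analysis sketch; the helper function below is illustrative and not part of the study's Markov model:

```python
def incremental_analysis(cost_a, qaly_a, cost_b, qaly_b):
    """Compare therapy A against comparator B. A is dominant when it
    yields more QALYs at a lower cost, in which case no incremental
    cost-effectiveness ratio (ICER) needs to be computed."""
    d_cost = cost_a - cost_b
    d_qaly = qaly_a - qaly_b
    dominant = d_cost < 0 and d_qaly > 0
    icer = None if dominant else d_cost / d_qaly
    return d_cost, d_qaly, dominant, icer

# Figures reported in the abstract (apixaban vs LMWH/VKA):
d_cost, d_qaly, dominant, icer = incremental_analysis(
    13374.70, 5.865, 13738.30, 5.838)
```

With a cost difference of about -363.60 and a QALY gain of about 0.027, apixaban dominates the comparator, matching the abstract's conclusion.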
Abstract:
Analysis of data without labels is commonly performed with unsupervised machine learning techniques. Such techniques provide representations that are more meaningful for understanding the problem at hand than inspection of the data alone. Although abundant expert knowledge exists in many areas where unlabelled data are examined, such knowledge is rarely incorporated into automatic analysis. Incorporating expert knowledge is frequently a matter of combining multiple data sources from disparate hypothetical spaces. In cases where such spaces belong to different data types, this task becomes even more challenging. In this paper we present a novel immune-inspired method that enables the fusion of such disparate types of data for a specific set of problems. We show that our method provides a better visual understanding of one hypothetical space with the help of data from another hypothetical space. We believe that our model has implications for the field of exploratory data analysis and knowledge discovery.
Abstract:
Doctoral thesis, Universidade de Brasília, Instituto de Ciências Sociais, Departamento de Sociologia, 2014.
Abstract:
Master's dissertation in Didactics of Modern Languages and Cultures, specialization in English, Faculdade de Ciências Humanas e Sociais, Univ. do Algarve, 2003.
Abstract:
Constitutional Amendment 64/2010 guaranteed the Human Right to Food as a basic and social right, amending Article 6 of the Federal Constitution. The article analyzes the significant implications of this change for the management of Brazilian public policies. By ensuring the Right to Food as a basic and social right, the Constitution established a duty, or a positive obligation, of the Brazilian State. The article also discusses the meaning of this change for the Brazilian information system, arguing that data sources and indicator systems built for the consistent monitoring of food and nutrition (in)security in the country already exist; it now falls to the federal government and to the managers of the Brazilian System of Statistical and Geographic Information to define the regularity and frequency of the application and dissemination of these instruments. Our attention focuses primarily on the possibilities of using the Household Budget Survey (Pesquisa de Orçamentos Familiares) and the National Household Sample Survey (Pesquisa Nacional por Amostra Domiciliar) as data sources.
Abstract:
Libraries, since their inception 4,000 years ago, have been in a process of constant change. Although change was slow for centuries, in recent decades academic libraries have continuously striven to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource-allocation decisions for their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is conducted. Firstly, a holistic structure and the toolset required to holistically assess academic libraries are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used in which the library system and collection are analyzed from the perspective of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system: the library's performance, and the costs incurred and resources consumed by library services, are analyzed. The second quadrant evaluates the external perspective of the library system: users' perception of service quality is judged in this quadrant. The third quadrant analyzes the external perspective of the library collection: the impact of the current library collection on its users is evaluated.
Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns of the library collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in an adequate scheme for decision support. A data warehousing approach is therefore designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and applied for several purposes: 1) Data visualization and reporting, allowing library managers to publish library indicators simply and quickly using online reporting tools. 2) Sophisticated data analysis through data mining tools; three data mining techniques are examined in this research study: regression, clustering and classification. These techniques are applied to the case study as follows: predicting future investment in library development; finding clusters of users who share common interests and similar profiles but belong to different faculties; and identifying library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) Input for optimization models; early experiences of developing an optimal resource-allocation model to distribute resources among the different processes of a library system are documented in this study. Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed. An optimization model is defined with the objective of maximizing the usage of the digital collection across all library divisions, subject to a single collection budget.
By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution that assists library managers in making economic decisions based on an “as realistic as possible” view of the library situation.
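The fund-allocation problem described above (maximize digital-collection usage across divisions under a single budget) can be sketched, under strong simplifying assumptions, as a greedy usage-per-cost heuristic. The division names, costs, and usage figures below are hypothetical; the study's actual optimization model is not reproduced here:

```python
def allocate_budget(divisions, budget):
    """Greedy sketch: fund division requests in descending order of
    expected usage per unit cost until the collection budget runs out."""
    ranked = sorted(divisions, key=lambda d: d["usage"] / d["cost"],
                    reverse=True)
    allocation, remaining = {}, budget
    for d in ranked:
        spend = min(d["cost"], remaining)  # partial funding allowed
        allocation[d["name"]] = spend
        remaining -= spend
    return allocation

# Hypothetical division requests (cost) and expected usage (downloads):
divisions = [
    {"name": "engineering", "cost": 40_000, "usage": 120_000},
    {"name": "medicine",    "cost": 30_000, "usage": 105_000},
    {"name": "humanities",  "cost": 20_000, "usage": 30_000},
]
plan = allocate_budget(divisions, budget=60_000)
```

A full formulation would express this as a linear or integer program (maximize total expected usage subject to the budget constraint), for which the greedy ranking is only a first approximation.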
Awake examination versus DISE for surgical decision making in patients with OSA: A systematic review
Abstract:
OBJECTIVE: Traditionally, upper airway examination is performed while the patient is awake. However, in the past two decades, drug-induced sleep endoscopy (DISE) has been used as a method of three-dimensional evaluation of the upper airway during pharmacologically induced sleep. This study aimed to systematically review the evidence regarding the usefulness of DISE compared with that of traditional awake examination for surgical decision making in patients with obstructive sleep apnea (OSA). DATA SOURCES: The Scopus, PubMed, and Cochrane Library databases were searched. REVIEW METHODS: Only studies whose primary objective was to evaluate the usefulness of DISE for surgical decision making in patients with OSA were selected. The included studies directly compared awake examination data with DISE outcome data in terms of possible influences on surgical decision making and operative success. RESULTS: A total of eight studies with 535 patients were included in this review. Overall, the surgical treatment changed after DISE in 50.24% (standard deviation 8.4) of cases. These changes were more frequently associated with structures contributing to hypopharyngeal or laryngeal obstruction. However, these differences do not automatically indicate a higher success rate. CONCLUSION: This review emphasized the direct impact of DISE, compared with that of awake examination, on surgical decision making in OSA patients. However, it is also clear that the available published studies lack evidence on the association between this impact and surgical outcomes.