973 results for correlated data


Relevance: 20.00%

Publisher:

Abstract:

In Part I of the present work we describe the viscosity measurements performed on tris(2-ethylhexyl) trimellitate, or 1,2,4-benzenetricarboxylic acid, tris(2-ethylhexyl) ester (TOTM), up to 65 MPa and at six temperatures from (303 to 373) K, using a new vibrating-wire instrument. The main aim is to support the proposal of that liquid as a potential reference fluid for high viscosity at high pressure and high temperature. The present Part II reports the density measurements of TOTM that are necessary not only to compute the viscosity data presented in Part I, but also as complementary data for the mentioned proposal. The density measurements were obtained using a vibrating U-tube densimeter, model DMA HP, with a DMA5000 as the reading unit, both instruments from Anton Paar GmbH. The measurements were performed along five isotherms from (293 to 373) K and at eleven different pressures up to 68 MPa. As far as the authors are aware, the viscosity and density results are the first above atmospheric pressure to be published for TOTM. Owing to TOTM's high viscosity, its density data were corrected for the viscosity effect on U-tube density measurements. This effect was estimated using two Newtonian viscosity standard liquids, 20 AW and 200 GW. The density data were correlated with temperature and pressure using a modified Tait equation, with deviations within ±0.25%. The expanded uncertainty of the present density results is estimated as ±0.2% at a 95% confidence level. Furthermore, the isothermal compressibility, κ_T, and the isobaric thermal expansivity, α_p, were obtained by differentiation of the modified Tait equation used to correlate the density data. The corresponding uncertainties, at a 95% confidence level, are estimated to be less than ±1.5% and ±1.2%, respectively. No isobaric thermal expansivity or isothermal compressibility data for TOTM were found in the literature. (C) 2014 Elsevier B.V. All rights reserved.
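For reference, a common form of the modified Tait correlation and the two derivative properties mentioned above are sketched below; the exact parameterization of ρ₀(T), B(T) and C used by the authors is an assumption, not taken from the paper.

```latex
% A common form of the modified Tait correlation (assumed form; the paper's
% exact parameterization of \rho_0(T), B(T) and C may differ):
\rho(T,p) = \frac{\rho_0(T)}{1 - C \ln\!\left(\dfrac{B(T)+p}{B(T)+p_0}\right)}
% The derivative properties quoted above then follow by differentiation:
\kappa_T = \frac{1}{\rho}\left(\frac{\partial \rho}{\partial p}\right)_T ,
\qquad
\alpha_p = -\frac{1}{\rho}\left(\frac{\partial \rho}{\partial T}\right)_p
```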

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE To evaluate the individual and contextual determinants of the use of health care services in the metropolitan region of Sao Paulo. METHODS Data from the Sao Paulo Megacity study – the Brazilian version of the World Mental Health Survey multicenter study – were used. A total of 3,588 adults living in 69 neighborhoods in the metropolitan region of Sao Paulo, SP, Southeastern Brazil, including 38 municipalities and 31 neighboring districts, were selected using multistratified sampling of the non-institutionalized population. Multilevel Bayesian logistic models were fitted to identify the individual and contextual determinants of the use of health care services in the past 12 months and of the presence of a regular physician for routine care. RESULTS The contextual characteristics of the place of residence (income inequality, violence, and median income) showed no significant correlation (p > 0.05) with the use of health care services or with the presence of a regular physician for routine care. The only exception was the negative correlation between living in areas with high income inequality and the presence of a regular physician (OR: 0.77; 95%CI 0.60–0.99) after controlling for individual characteristics. The study revealed a strong and consistent correlation between individual characteristics (mainly education and possession of health insurance), use of health care services, and presence of a regular physician. Presence of chronic and mental illnesses was strongly correlated with the use of health care services in the past year (regardless of individual characteristics) but not with the presence of a regular physician. CONCLUSIONS Individual characteristics, including higher education and possession of health insurance, were important determinants of the use of health care services in the metropolitan area of Sao Paulo. A better understanding of these determinants is essential for the development of public policies that promote equitable use of health care services.
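As a point of reference, a random-intercept (two-level) Bayesian logistic model of the kind described might be written as follows; the notation (individual i nested in neighborhood j, individual covariates x, neighborhood-level predictors z) is illustrative, not taken from the study.

```latex
% Illustrative two-level (random-intercept) logistic model; notation assumed,
% with individual i nested in neighborhood j:
y_{ij} \sim \mathrm{Bernoulli}(\pi_{ij}), \qquad
\operatorname{logit}(\pi_{ij}) = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij}
  + \boldsymbol{\gamma}^{\top}\mathbf{z}_{j} + u_j, \qquad
u_j \sim \mathcal{N}(0,\sigma_u^{2})
```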

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE To analyze the influence of contextual characteristics on the control of tuberculosis in prisons, and the influence of the program's degree of implementation on the observed effects. METHODS A multiple case study, with a qualitative approach, conducted in the prison systems of two Brazilian states in 2011 and 2012. Two prisons were analyzed in each state, and a prison hospital was analyzed in one of them. The data were submitted to content analysis based on external, political-organizational, implementation, and effect dimensions. Contextual factors and those related to the program organization were correlated. The independent variable was the degree of program implementation and the dependent one, the effects of the Tuberculosis Control Program in prisons. RESULTS The context with the highest sociodemographic vulnerability, the highest incidence rate of tuberculosis, and the smallest amount of available resources was associated with a low degree of program implementation. Tuberculosis treatment results in the prison system were better where the program had already been partially implemented than in the case with a low degree of implementation. CONCLUSIONS The degree of implementation and the contexts – external and political-organizational dimensions – contribute simultaneously to the effects observed in the control of tuberculosis in the analyzed prisons.

Relevance: 20.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS Data envelopment analysis, a linear programming technique, creates a best-practice frontier by comparing the observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model: medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, the financing budget) to analyze frontier shifts in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS The mean scores for health care, teaching, and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best-performing year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits, 34.0% in admissions, 12.0% in undergraduate students, 13.0% in multi-professional residents, 48.0% in graduate students, and 7.0% in research projects, as well as a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in the financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care, and no variation in research. CONCLUSIONS The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best-practice frontier.
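For orientation, the standard output-oriented DEA model with variable returns to scale (the BCC model) is sketched below; this is the generic building block of the approach, not the full dynamic network formulation with link and carry-over variables used in the paper.

```latex
% Output-oriented DEA with variable returns to scale (BCC model) for unit o;
% x_{ij} are the inputs and y_{rj} the outputs of unit j (generic form, not
% the paper's full dynamic network formulation):
\max_{\varphi,\ \lambda} \ \varphi
\quad \text{s.t.} \quad
\sum_{j} \lambda_j x_{ij} \le x_{io} \ \ \forall i, \qquad
\sum_{j} \lambda_j y_{rj} \ge \varphi\, y_{ro} \ \ \forall r, \qquad
\sum_{j} \lambda_j = 1, \qquad \lambda_j \ge 0 .
% The efficiency score of unit o is 1/\varphi^{*}, equal to 1 on the frontier.
```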

Relevance: 20.00%

Publisher:

Abstract:

This paper addresses the calculation of derivatives of fractional order for non-smooth data. The effect of noise is avoided by adopting an optimization formulation based on genetic algorithms (GA). Given the flexibility of evolutionary schemes, a hierarchical GA composed of a series of two GAs, each one with a distinct fitness function, is established.
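As background, one standard discretization of a fractional-order derivative is the Grünwald–Letnikov form shown below; whether the authors adopt this particular definition is an assumption, since the abstract does not say.

```latex
% Gruenwald-Letnikov approximation of the fractional derivative of order \alpha
% on a uniform grid with step h (a standard definition, assumed here; the paper
% does not state which definition it adopts):
D^{\alpha} f(t) \approx \frac{1}{h^{\alpha}}
  \sum_{k=0}^{\lfloor t/h \rfloor} (-1)^{k} \binom{\alpha}{k} f(t - k h)
```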

Relevance: 20.00%

Publisher:

Abstract:

The morpho-structural evolution of oceanic islands results from competition between volcano growth and partial destruction by mass-wasting processes. We present here a multi-disciplinary study of the successive stages of development of Faial (Azores) during the last 1 Myr. Using a high-resolution digital elevation model (DEM) and new K/Ar, tectonic, and magnetic data, we reconstruct the rapidly evolving topography at successive stages, in response to complex interactions between volcanic construction and mass wasting, including the development of a graben. We show that: (1) sub-aerial evolution of the island first involved the rapid growth of a large elongated volcano at ca. 0.85 Ma, followed by its partial destruction over half a million years; (2) beginning about 360 ka, a new small edifice grew on the NE of the island, and was subsequently cut by normal faults responsible for initiation of the graben; (3) after an apparent pause of ca. 250 kyr, the large Central Volcano (CV) developed on the western side of the island at ca. 120 ka, accumulating a thick pile of lava flows in less than 20 kyr, which were partly channelized within the graben; (4) the period between 120 ka and 40 ka is marked by widespread deformation at the island scale, including westward propagation of faulting and associated erosion of the graben walls, which produced sedimentary deposits; subsequent growth of the CV at 40 ka was then constrained within the graben, with lava flowing onto the sediments up to the eastern shore; (5) the island's evolution during the Holocene involves basaltic volcanic activity along the main southern faults and pyroclastic eruptions associated with the formation of a caldera volcano-tectonic depression. We conclude that the whole evolution of Faial Island has been characterized by successive short volcanic pulses, probably controlled by brief episodes of regional deformation. Each pulse has been separated by considerable periods of volcanic inactivity during which the Faial graben gradually developed. We propose that the volume loss associated with sudden magma extraction from a shallow reservoir in different episodes triggered incremental downward graben movement, as observed historically, when an immediate vertical collapse of up to 2 m occurred along the western segments of the graben at the end of the Capelinhos eruptive crisis (1957-58).

Relevance: 20.00%

Publisher:

Abstract:

Conference: CONTROLO'2012 - 16-18 July 2012 - Funchal

Relevance: 20.00%

Publisher:

Abstract:

Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids, or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce there is a need to install/configure/access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon Cloud. It would be desirable to provide more flexibility in adjusting such configurations according to the application characteristics. Furthermore, the composition of the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. In the paper we present experimental results from using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
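To illustrate the kind of user-defined map and reduce functions mentioned above, here is a minimal word-count sketch in plain Python; it is a generic, framework-agnostic illustration and does not use the AWARD framework's actual API, which is not described in the abstract.

```python
from collections import defaultdict

# Minimal, framework-agnostic sketch of the two functions a MapReduce user defines.
# (Illustrative only; AWARD's actual subworkflow API is not shown in the abstract.)

def map_func(document: str):
    """Emit (word, 1) pairs for every word in one input document."""
    for word in document.lower().split():
        yield word, 1

def reduce_func(word: str, counts):
    """Sum the partial counts emitted for a single word."""
    return word, sum(counts)

def run_mapreduce(documents):
    """Toy sequential driver: group intermediate pairs by key (shuffle), then reduce."""
    groups = defaultdict(list)
    for doc in documents:
        for key, value in map_func(doc):
            groups[key].append(value)
    return dict(reduce_func(k, v) for k, v in groups.items())

if __name__ == "__main__":
    print(run_mapreduce(["to be or not to be", "to map and to reduce"]))
```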

Relevance: 20.00%

Publisher:

Abstract:

Feature selection is a central problem in machine learning and pattern recognition. On large datasets (in terms of dimension and/or number of instances), using search-based or wrapper techniques can be computationally prohibitive. Moreover, many filter methods based on relevance/redundancy assessment also take a prohibitively long time on high-dimensional datasets. In this paper, we propose efficient unsupervised and supervised feature selection/ranking filters for high-dimensional datasets. These methods use low-complexity relevance and redundancy criteria, applicable to supervised, semi-supervised, and unsupervised learning, and are able to act as pre-processors for computationally intensive methods, focusing their attention on smaller subsets of promising features. The experimental results, with up to 10^5 features, show the time efficiency of our methods, with lower generalization error than state-of-the-art techniques, while being dramatically simpler and faster.
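A minimal unsupervised relevance/redundancy filter in the spirit described above might look like the sketch below; the specific criteria (variance as a relevance proxy, absolute correlation as redundancy) are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def relevance_redundancy_filter(X: np.ndarray, k: int, redundancy_threshold: float = 0.95):
    """Greedy unsupervised filter: rank features by variance (relevance proxy) and
    skip any feature too correlated with an already selected one (redundancy).
    Illustrative sketch only; not the exact criteria proposed in the paper."""
    order = np.argsort(X.var(axis=0))[::-1]          # most "relevant" features first
    selected = []
    for j in order:
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) > redundancy_threshold
            for s in selected
        )
        if not redundant:
            selected.append(j)
        if len(selected) == k:
            break
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=200)   # nearly duplicate feature
    print(relevance_redundancy_filter(X, k=5))
```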

Relevance: 20.00%

Publisher:

Abstract:

This work describes the impact of different teachers' approaches to using Moodle to support their courses at the Polytechnic of Porto - School of Engineering. The study covers five different courses, from different degrees and different years, and includes a number of Moodle resources especially supporting laboratory classes. These and other active resources are analyzed in particular in order to evaluate students' adherence to them. One particular course includes a number of remote experiments, made available through VISIR (Virtual Instrument Systems in Reality) and directly accessible through links included in the Moodle course page. The collected data have been correlated with students' classifications in the lab component and in the exam, each one accounting for 50% of their final marks. This analysis benefited from the existence of different teachers' approaches, which resulted in a diversity of Moodle-supported environments. Conclusions point to the existence of a positive correlation between the number of Moodle accesses and the final exam grade, although the quality of the resources made available by the teachers seems to be preponderant over their quantity. In addition, different student perspectives were found regarding active resources: while some resources seem to encourage students to participate (for instance, online quizzes or online reports), others, more demanding, are unable to stimulate the majority of them.
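The kind of correlation analysis referred to above can be reproduced in a few lines of Python; the data frame below uses synthetic example values and hypothetical column names (moodle_accesses, exam_grade), since the study's dataset is not part of the abstract.

```python
import pandas as pd
from scipy.stats import pearsonr

# Synthetic per-student records; values and column names are illustrative
# placeholders, not data from the study.
df = pd.DataFrame({
    "moodle_accesses": [12, 45, 30, 80, 5, 60, 22, 95],
    "exam_grade":      [8.0, 13.5, 11.0, 16.0, 6.5, 14.0, 10.0, 17.5],
})

# Pearson correlation between number of Moodle accesses and final exam grade.
r, p_value = pearsonr(df["moodle_accesses"], df["exam_grade"])
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
```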

Relevance: 20.00%

Publisher:

Abstract:

This work extends a recent comparative study covering four different courses lectured at the Polytechnic of Porto - School of Engineering, with respect to the usage of a particular Learning Management System, i.e. Moodle, and its impact on students' results. A fifth course, which includes a number of resources especially supporting laboratory classes, is now added to the analysis. This particular course includes a number of remote experiments, made available through VISIR (Virtual Instrument Systems in Reality) and directly accessible through links included in the Moodle course page. We have analyzed the students' behavior in following these links and in effectively running experiments in VISIR (and also in using other lab-related resources in Moodle). These data have been correlated with students' classifications in the lab component and in the exam, each one accounting for 50% of their final marks. We aimed to compare students' performance in a richly Moodle-supported environment (with a lab component) and in a poorly Moodle-supported environment (with only a theoretical component). This question followed from conclusions drawn in the above-mentioned comparative study, where it was shown that even though a positive correlation existed between the number of Moodle accesses and the final exam grade obtained by each student, the explanation behind it was not straightforward, as the quality of the resources was preponderant over their quantity.

Relevance: 20.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance: 20.00%

Publisher:

Abstract:

The aim of this study was to contribute to the assessment of exposure levels to ultrafine particles in the urban environment of Lisbon, Portugal, due to automobile traffic, by monitoring the lung-deposited alveolar surface area (resulting from exposure to ultrafine particles) in a major avenue leading to the town center during late spring, as well as in indoor buildings facing it. Data revealed differentiated patterns for weekdays and weekends, consistent with the PM2.5 and PM10 patterns currently monitored by air quality stations in Lisbon. The observed ultrafine particulate levels may be directly correlated with fluxes in automobile traffic. During a typical week, the lung-deposited alveolar surface area concentrations of ultrafine particles varied between 35 and 89.2 μm²/cm³, which are comparable with levels reported for other towns in Germany and the United States. The measured values allowed for determination of the number of ultrafine particles per cubic centimeter, which is comparable to levels reported for Madrid and Brisbane. Regarding outdoor/indoor levels, we observed higher levels (32 to 63%) outdoors, which is somewhat lower than levels observed in houses in Ontario.

Relevance: 20.00%

Publisher:

Abstract:

Most traditional software and database development approaches tend to be serial, not evolutionary, and certainly not agile, especially on data-oriented aspects. Most of the more commonly used methodologies are strict, meaning they are composed of several stages, each with very specific associated tasks. A clear example is the Rational Unified Process (RUP), divided into Business Modeling, Requirements, Analysis & Design, Implementation, Testing, and Deployment. But what happens when the need for a well-designed and structured plan meets the reality of a small starting company that aims to build an entire user experience solution? Here, resource control and time productivity are vital, requirements are in constant change, and so is the product itself. In order to succeed in this environment, a highly collaborative and evolutionary development approach is mandatory, and constantly changing requirements imply an iterative development process. The project focus is on Data Warehouse development and business modeling. This area is usually a tricky one: business knowledge is part of the enterprise, and how it works, its goals, and what is relevant for analysis are internal business processes. Throughout this document it will be explained why Agile Modeling development was chosen; how an iterative and evolutionary methodology allowed for reasonable planning and documentation while permitting development flexibility, from idea to product; and, more importantly, how it was applied to the development of a retail-focused Data Warehouse. A productized Data Warehouse built on the knowledge of not one but several clients' needs; one that aims not just to cover the usual business areas but to create an innovative set of business metrics by joining them with store environment analysis, converting Business Intelligence into Actionable Business Intelligence.