975 results for data management policies


Relevance:

90.00%

Publisher:

Abstract:

The growing concentration of people in urban centers has made the chronic problems of cities increasingly unsustainable. Rooted mainly in the lack of infrastructure and sanitation and in insufficient investment in critical sectors such as health, education, housing, and transportation, these problems markedly degrade the quality of life of city dwellers and put the management policies of urbanized spaces to the test. To reverse this situation, tools such as indices and environmental indicators are essential, both to assess current conditions and to help predict future scenarios. In this context, this study presents an Environmental Sanitation Index (ISBA) that examines four urban systems (water supply, sewage, solid waste, and urban drainage) within a specific geographical area: Drainage Basin XII, as defined by the Stormwater Urban Drainage Plan of Natal, the capital of Rio Grande do Norte. Together with the analysis of other factors, the index was used to characterize the current conditions of the basin and thus support the proposal of the most suitable solutions. To compute the index, a questionnaire was applied to a sample of 384 (three hundred and eighty-four) households, covering two variables: access to services and the population's satisfaction with them. The ISBA showed that the most deficient system is the collection and disposal of effluents (ICE = 47.66%), followed by stormwater drainage (IDAP = 54.17%), water supply (AAI = 61.36%), and solid waste collection (IRS = 78.28%). The ISBA also showed that qualitative data with an evidently subjective component (such as user satisfaction) can be of great value in an assessment: the correlation coefficient between the variables "Access" and "Satisfaction" was 0.8234, indicating a strong correlation between the existence and quality of the services offered and the impressions of the population that receives them.
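
For illustration, a minimal Python sketch of the kind of correlation reported above (r = 0.8234 between "Access" and "Satisfaction"); the household scores below are hypothetical placeholders, not the survey data.

```python
# Minimal sketch: correlating household "access" and "satisfaction" scores.
# The values below are illustrative only, not the ISBA survey data.
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-household scores on a 0-100 scale.
access       = [80, 55, 90, 40, 70, 65, 85, 30]
satisfaction = [75, 50, 88, 45, 60, 70, 80, 35]

print(f"r = {pearson(access, satisfaction):.4f}")
```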

Relevance:

90.00%

Publisher:

Abstract:

Software Transactional Memory (STM) systems perform poorly under high contention. Since many transactions compete for the same data, most of them are aborted, wasting processor time. Contention management policies are typically used to mitigate this, but they are passive approaches: they wait for an abort to happen before taking action. More proactive approaches have emerged that try to predict when a transaction is likely to abort so that its execution can be delayed. Such techniques are limited, as they do not replace the doomed transaction with another or, when they do, they rely on the operating system for that, having little or no control over which transaction should run. In this paper we propose LUTS, a Lightweight User-Level Transaction Scheduler based on an execution context record mechanism. Unlike other techniques, LUTS provides the means for selecting another transaction to run in parallel, thus improving system throughput. Moreover, it avoids most of the issues caused by pseudo-parallelism, as it launches only as many system-level threads as there are available processor cores. We discuss the design of LUTS and present three conflict-avoidance heuristics built around its scheduling capabilities. Experimental results, conducted with the STMBench7 and STAMP benchmark suites, show the efficiency of LUTS when running high-contention applications and how the conflict-avoidance heuristics can improve STM performance even further. In fact, our transaction scheduling techniques are capable of improving program performance even in overloaded scenarios. © 2011 Springer-Verlag.
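
As a rough illustration of the conflict-avoidance idea (not the actual LUTS implementation, whose internals are not detailed here), the sketch below keeps a history of observed aborts between transactions and, when a core becomes free, picks the ready transaction predicted to conflict least with those currently running; all names are invented.

```python
# Illustrative sketch of a user-level, conflict-aware transaction scheduler:
# instead of delaying a transaction predicted to conflict, it selects another
# ready transaction that is not expected to conflict with the running ones.
from collections import defaultdict, deque

class ConflictAwareScheduler:
    def __init__(self):
        self.ready = deque()              # transaction ids waiting to run
        self.running = set()              # transaction ids currently running
        self.aborts = defaultdict(int)    # (tx_a, tx_b) -> observed abort count

    def submit(self, tx_id):
        self.ready.append(tx_id)

    def record_abort(self, tx_id, conflicting_id):
        # Update the abort history symmetrically for the conflicting pair.
        self.aborts[(tx_id, conflicting_id)] += 1
        self.aborts[(conflicting_id, tx_id)] += 1

    def predicted_conflicts(self, tx_id):
        # Sum of past aborts between tx_id and every running transaction.
        return sum(self.aborts[(tx_id, r)] for r in self.running)

    def next_transaction(self):
        """Pick the ready transaction least likely to conflict with running ones."""
        if not self.ready:
            return None
        best = min(self.ready, key=self.predicted_conflicts)
        self.ready.remove(best)
        self.running.add(best)
        return best

    def finished(self, tx_id):
        self.running.discard(tx_id)
```

The key design point mirrored here is that the scheduler, not the operating system, decides which transaction runs next, so a doomed transaction can be swapped for a more promising one instead of simply being delayed.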

Relevance:

90.00%

Publisher:

Abstract:

This research addresses the management of integrated secondary education policies and aims to analyze it in the network of technological schools of Pará (EETEPA) in the municipality of Belém, and in particular at the Escola de Educação Tecnológica Professor Anísio Teixeira, in order to determine how management has contributed to the implementation of this policy. The research was developed through a case study from a critical perspective, using documentary research and semi-structured interviews as data-collection instruments. In Brazil, secondary education has been undergoing transformation through several policies, among which we highlight the integrated secondary education policy, whose proposal is to articulate technical vocational education at the secondary level with general secondary education in a formative process that provides access to both scientific and professional knowledge. One of the main questions this research seeks to answer is whether management has contributed to the implementation of policies for vocational training integrated with secondary education in the network of technological schools of Pará, in the municipality of Belém. We believe that management is one of the means through which actions aimed at improvement and transformation are materialized within the school. In this context, our research found that the management model practiced by the Secretaria de Educação do Estado do Pará through the EETEPA network is unilateral and has not, at present, been able to mobilize the principles of integrated education, unlike the school that was the locus of the research, which, through the exercise of democratic management, has been trying to consolidate the integrated secondary education policy, even in the face of the "persecution" of the school's technical and teaching staff and of scarce incentives. The findings at the conclusion of the research show that the entire theoretical foundation of this policy rests on Marxian and Marxist frameworks, and that democratic management and its principles were important elements at the moment the policy was implemented and contributed to building its consolidation, but they have not been sufficient for its maintenance, since the management of the EETEPA network, which proceeds from a unilateral perspective, has undermined the actions and projects of integrated secondary education on the "school floor".

Relevance:

90.00%

Publisher:

Abstract:

The objective of this study was to estimate heritability for calving interval (CI) and age at first calving (AFC), and to calculate repeatability for CI, in buffaloes using Bayesian inference. The database was provided by the Brazilian Buffaloes Genetic Improvement Program and consists of information on 628 females from four herds, born between 1980 and 2003. To estimate the variance components, univariate analyses were performed using the Gibbs sampler procedure included in the MTGSAM software. The model for CI included the random direct additive and permanent environment effects, and the fixed effects of contemporary group and calving order. The model for AFC included the direct additive random effect and contemporary group as a fixed effect. Convergence was diagnosed with the Geweke criterion, implemented through the Bayesian Output Analysis package in the R software. The estimated averages were 433.2 days and 36.7 months for CI and AFC, respectively. The means, medians and modes of the calculated heritability coefficients were similar. The heritability coefficients were 0.10 and 0.42 for CI and AFC, respectively, with posterior marginal densities that follow a normal distribution for both traits. The repeatability for CI was 0.13. The low heritability estimated for CI indicates that variation in this trait is largely influenced by environmental factors such as herd management policies. Age at first calving shows clear potential for improvement through direct selection in these animals.
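
A small sketch of how heritability and repeatability can be summarized from Gibbs-sampler draws of the variance components; the chains below are simulated placeholders, not the MTGSAM output for this data set.

```python
# Sketch: summarising MCMC draws of variance components into heritability and
# repeatability posteriors. The draws are simulated for illustration only.
import random
import statistics

random.seed(1)
n = 5000
# Hypothetical posterior draws for the additive (va), permanent-environment
# (vpe) and residual (ve) variances of a trait such as calving interval.
va  = [random.gauss(180, 30)  for _ in range(n)]
vpe = [random.gauss(55, 15)   for _ in range(n)]
ve  = [random.gauss(1500, 120) for _ in range(n)]

h2  = [a / (a + p + e) for a, p, e in zip(va, vpe, ve)]        # heritability
rep = [(a + p) / (a + p + e) for a, p, e in zip(va, vpe, ve)]  # repeatability

for name, chain in (("heritability", h2), ("repeatability", rep)):
    print(name,
          "mean={:.3f}".format(statistics.mean(chain)),
          "median={:.3f}".format(statistics.median(chain)))
```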

Relevance:

90.00%

Publisher:

Abstract:

Accuracy in the physical inventory process is essential for efficient inventory control: it ensures the availability of products and guarantees that the information held in the information systems matches the reality of the inventories. The inventory management policies of the company that is the object of this study establish that all materials in its inventory must be counted, which has proved a challenge. The aim of this work is therefore to identify the critical inventories and analyze them, looking for flaws and possible improvements in the inventory counting process. To this end, quality management tools such as Pareto charts and cause-and-effect diagrams were used in an action research. The results show that the stocks of finished products are critical in volume, and that the count can be hampered by a lack of training and of personnel qualified to perform the process, as well as by limitations in the ERP system used. With the actions taken against these problems, an improvement in the process could be observed: data collection and processing became easier and the occurrence of errors decreased. In addition, targets were met faster than in the previous period.
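
A minimal sketch of the Pareto analysis mentioned above, ranking inventory items by counting error and accumulating their share of the total; the item codes and figures are invented, not data from the company studied.

```python
# Sketch of a Pareto analysis of cycle-count discrepancies (hypothetical data).
discrepancies = {  # item code -> absolute counting error (units)
    "FIN-001": 420, "FIN-002": 310, "FIN-007": 260,
    "RAW-104": 90, "RAW-220": 45, "PKG-019": 30, "RAW-310": 15,
}

total = sum(discrepancies.values())
cumulative = 0.0
print(f"{'item':8} {'error':>6} {'cum %':>7}")
for item, err in sorted(discrepancies.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += err
    print(f"{item:8} {err:6d} {100 * cumulative / total:6.1f}%")
# Items at the top of the ranking (here the finished goods) are the natural
# candidates for focused recounting, operator training and ERP fixes.
```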

Relevance:

90.00%

Publisher:

Abstract:

Graduate Program in Psychology - FCLAS

Relevance:

90.00%

Publisher:

Abstract:

Nowadays companies generate large amounts of data from different sources, and some of them produce more data than they can analyze. Big Data refers to data sets that grow very fast and are collected many times over a short period. This work focuses on the importance of correct Big Data management in an industrial plant. Through a case study of a company in the pulp and paper sector, we present how its problems were addressed through appropriate data management. The final chapters discuss the results achieved by the company, showing how the correct choice of which data to monitor and analyze brought benefits to the company, and recommend best practices for Big Data management.

Relevance:

90.00%

Publisher:

Abstract:

Large-scale wireless ad hoc networks of computers, sensors, PDAs, etc. (i.e., nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. One example of ad hoc networks is sensor networks, which are usually composed of small units able to sense elementary data and transmit them to a sink, where they are subsequently processed by an external machine. Recent improvements in the memory and computational power of sensors, together with reduced energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without relying on flooding/broadcasting operations or on GPS. The resulting ad hoc network does not suffer from the dead-end problem, which arises in geography-based routing when a node is unable to locate a neighbor closer to the destination than itself. WGrid also provides multidimensional data management, since the nodes' virtual coordinates can act as a distributed database without requiring any special implementation or reorganization. Any kind of data, both single- and multidimensional, can be distributed, stored and managed. We show how a location service can be easily implemented, so that any search is reduced to a simple query, as for any other data type. WGrid was then extended by adopting a replication methodology; we called the resulting algorithm WRGrid. Like WGrid, WRGrid acts as a distributed database without requiring any special implementation or reorganization, and any kind of data can be distributed, stored and managed. We evaluated the benefits of replication on data management and found, from experimental results, that it can halve the average number of hops in the network. The direct consequences are a significant improvement in energy consumption and a better workload balance among sensors (number of messages routed by each node). Finally, thanks to the replicas, whose number can be chosen arbitrarily, the resulting sensor network can cope with sensor disconnections and reconnections, due to sensor failures, without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery from link and/or device failures that may happen due to crashes, battery exhaustion or temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each pair of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions, guaranteeing robustness while drastically reducing energy consumption. An extensive set of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. The performance analysis has been compared with existing algorithms in order to validate the results.
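
By way of illustration only: the following is a generic greedy-forwarding sketch over virtual coordinates, not WGrid's actual routing rule. It shows routing without GPS or flooding and where the dead-end problem, which WGrid avoids by construction, would appear.

```python
# Generic greedy forwarding over virtual coordinates (illustrative only).
def distance(a, b):
    """Toy distance between two virtual coordinates (tuples of numbers)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def greedy_route(source, target, neighbors, coord):
    """Forward hop by hop to the neighbor closest to the target coordinate."""
    path, current = [source], source
    while current != target:
        nxt = min(neighbors[current], key=lambda n: distance(coord[n], coord[target]))
        if distance(coord[nxt], coord[target]) >= distance(coord[current], coord[target]):
            return path, False   # dead end: no neighbor is closer than the current node
        path.append(nxt)
        current = nxt
    return path, True

# Hypothetical 5-node topology with assigned virtual coordinates.
coord = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (2, 1), 4: (3, 1)}
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(greedy_route(0, 4, neighbors, coord))   # ([0, 1, 2, 3, 4], True)
```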

Relevance:

90.00%

Publisher:

Abstract:

The term Ambient Intelligence (AmI) refers to a vision of the future of the information society in which smart electronic environments are sensitive and responsive to the presence of people and their activities (context awareness). In an ambient intelligence world, devices work in concert to support people in carrying out their everyday activities, tasks and rituals in an easy, natural way, using information and intelligence hidden in the network connecting these devices. This promotes the creation of pervasive environments that improve the quality of life of the occupants and enhance the human experience. AmI stems from the convergence of three key technologies: ubiquitous computing, ubiquitous communication and natural interfaces. Ambient intelligent systems are heterogeneous and require excellent cooperation between several hardware/software technologies and disciplines, including signal processing, networking and protocols, embedded systems, information management, and distributed algorithms. Since a large number of fixed and mobile embedded sensors are deployed in the environment, Wireless Sensor Networks (WSNs) are one of the most relevant enabling technologies for AmI. WSNs are complex systems made up of a number of sensor nodes that can be deployed in a target area to sense physical phenomena and communicate with other nodes and base stations. These simple devices typically embed a low-power computational unit (microcontrollers, FPGAs, etc.), a wireless communication unit, one or more sensors and some form of energy supply (either batteries or energy scavenger modules). WSNs promise to revolutionize the interactions between the real physical world and human beings. Low cost, low computational power, low energy consumption and small size are characteristics that must be taken into consideration when designing and dealing with WSNs. To fully exploit the potential of distributed sensing approaches, a set of challenges must be addressed. Sensor nodes are inherently resource-constrained systems with very low power consumption and small size requirements, which enables them to reduce the interference with the physical phenomena sensed and allows easy and low-cost deployment. They have limited processing speed, storage capacity and communication bandwidth, which must be used efficiently to increase the degree of local "understanding" of the observed phenomena. A particular case of sensor nodes is video sensors. This topic holds strong interest for a wide range of contexts such as military, security, robotics and, most recently, consumer applications. Vision sensors are extremely effective for medium- to long-range sensing because vision provides rich information to human operators. However, image sensors generate a huge amount of data, which must be heavily processed before transmission due to the scarce bandwidth of radio interfaces. In particular, in video surveillance it has been shown that source-side compression is mandatory because of limited bandwidth and delay constraints. Moreover, there is ample opportunity to perform higher-level processing functions, such as object recognition, that have the potential to drastically reduce the required bandwidth (e.g., by transmitting compressed images only when something "interesting" is detected). The energy cost of image processing must, however, be carefully minimized. Imaging could therefore play, and already plays, an important role in sensing devices for ambient intelligence.
Computer vision can, for instance, be used to recognise persons and objects and to recognise behaviour such as illness or rioting. Having a wireless camera as a camera mote opens the way for distributed scene analysis: multiple eyes see more than one, and a camera system that can observe a scene from several directions would be able to overcome occlusion problems and describe objects in their true 3D appearance. In real time, these approaches are a recently opened field of research. In this thesis we pay attention to the realities of hardware/software technologies and to the design needed to realize systems for distributed monitoring, attempting to propose solutions to open issues and to fill the gap between AmI scenarios and hardware reality. The physical implementation of an individual wireless node is constrained by three important metrics outlined below. Although the design of a sensor network and its sensor nodes is strictly application-dependent, a number of constraints should almost always be considered. Among them:
• Small form factor, to reduce node intrusiveness.
• Low power consumption, to reduce battery size and to extend node lifetime.
• Low cost, for a widespread diffusion.
These limitations typically result in the adoption of low-power, low-cost devices such as low-power microcontrollers with a few kilobytes of RAM and tens of kilobytes of program memory, with which only simple data-processing algorithms can be implemented. However, the overall computational power of the WSN can be very large, since the network presents a high degree of parallelism that can be exploited through ad hoc techniques. Furthermore, through the fusion of information from the dense mesh of sensors, even complex phenomena can be monitored. In this dissertation we present our results in building several AmI applications suitable for a WSN implementation. The work can be divided into two main areas: low-power video sensor nodes and video processing algorithms, and multimodal surveillance. Low-power video sensor nodes and video processing algorithms: in comparison to scalar sensors, such as temperature, pressure, humidity, velocity and acceleration sensors, vision sensors generate much higher-bandwidth data due to the two-dimensional nature of their pixel array. We have tackled all the constraints listed above and have proposed solutions to overcome the current WSN limits for video sensor nodes. We have designed and developed wireless video sensor nodes focusing on small size and flexibility of reuse in different applications. The video nodes target a different design point: portability (on-board power supply, wireless communication) and a scanty power budget (500 mW), while still providing a prominent level of intelligence, namely sophisticated classification algorithms and a high level of reconfigurability. We developed two different video sensor nodes: the device architecture of the first is based on a low-cost, low-power FPGA + microcontroller system-on-chip, while the second is based on an ARM9 processor. Both systems, designed within the above-mentioned power envelope, can operate continuously with a Li-Polymer battery pack and a solar panel. Novel low-power, low-cost video sensor nodes which, in contrast to sensors that just watch the world, are capable of comprehending the perceived information in order to interpret it locally, are presented.
Featuring such intelligence, these nodes are able to cope with tasks such as the recognition of unattended bags in airports or of persons carrying potentially dangerous objects, which normally require a human operator. Vision algorithms for object detection and acquisition, such as human detection with Support Vector Machine (SVM) classification and abandoned/removed object detection, are implemented, described and illustrated on real-world data. Multimodal surveillance: in several setups the use of wired video cameras may not be possible, so building an energy-efficient wireless vision network for monitoring and surveillance is one of the major efforts in the sensor network and distributed surveillance communities. Pyroelectric Infra-Red (PIR) sensors have been used to extend the lifetime of a solar-powered video sensor node by providing an energy-level-dependent trigger to the video camera and the wireless module. This approach has been shown to extend node lifetime and can possibly result in continuous operation of the node. Being low-cost, passive (thus low-power) and presenting a limited form factor, PIR sensors are well suited to WSN applications. Moreover, aggressive power management policies are essential to achieve long-term operation of standalone distributed cameras. We have used an adaptive controller based on Model Predictive Control (MPC) to improve the system's performance, outperforming naive power management policies.
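
As a toy illustration, the sketch below shows an energy-level-dependent PIR trigger policy of the naive, threshold-based kind that the MPC controller is reported to outperform; the thresholds and operating modes are invented, not the thesis parameters.

```python
# Naive threshold-based power policy for a PIR-triggered video sensor node
# (illustrative only; an MPC controller would adapt these decisions online).
def camera_policy(pir_event: bool, battery_level: float) -> str:
    """Return an operating mode given a PIR trigger and the remaining
    battery fraction (0.0-1.0)."""
    if not pir_event:
        return "sleep"       # no motion: camera and radio stay off
    if battery_level > 0.5:
        return "full"        # full frame rate and on-board classification
    if battery_level > 0.2:
        return "low_power"   # reduced frame rate, transmit only detections
    return "sleep"           # critical energy: preserve the node, keep only the PIR on

# Example decisions over a few hypothetical (PIR event, battery level) samples.
for pir, batt in [(True, 0.9), (False, 0.9), (True, 0.4), (True, 0.1)]:
    print(pir, batt, "->", camera_policy(pir, batt))
```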

Relevance:

90.00%

Publisher:

Abstract:

The CMS experiment at the LHC collected a very large amount of data during Run-1 and is exploiting the shutdown period (LS1) to evolve its computing system. Among the possible improvements, there is ample room for optimizing the use of storage at the Tier-2 computing centres, which represent, within the Worldwide LHC Computing Grid (WLCG), the core of the resources dedicated to distributed analysis on the Grid. This thesis presents a study of the popularity of CMS data in distributed Grid analysis at the Tier-2 sites. The goal of the work is to equip the CMS computing system with a tool for systematically evaluating the amount of disk space written but never accessed at the Tier-2 centres, contributing to the construction of an advanced dynamic data management system able to adapt elastically to changing operating conditions, removing unnecessary data replicas or adding replicas of the most "popular" data, and thus, ultimately, increasing the overall analysis throughput. Chapter 1 provides an overview of the CMS experiment at the LHC. Chapter 2 describes the CMS Computing Model in general terms, focusing mainly on data management and the related infrastructure. Chapter 3 describes the CMS Popularity Service, giving an overview of the data popularity services already present in CMS before this work began. Chapter 4 describes the architecture of the toolkit developed for this thesis, laying the groundwork for the following chapter. Chapter 5 presents and discusses the data popularity studies carried out on the data collected through the previously developed infrastructure. Appendix A collects two examples of code created to manage the toolkit through which the data are collected and processed.
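
A minimal sketch of the kind of metric such a toolkit computes, i.e. the disk space written at a Tier-2 site but not accessed within a given window; the record layout, dataset names and figures below are placeholders, not the actual CMS popularity schema.

```python
# Sketch: space written but not accessed at a site since a given date
# (hypothetical replica catalogue, illustrative only).
from datetime import date

replicas = [  # (site, dataset, size_TB, last_access or None)
    ("T2_SITE_A", "/DatasetA/AOD", 12.0, date(2013, 5, 2)),
    ("T2_SITE_A", "/DatasetB/AOD",  8.5, None),
    ("T2_SITE_B", "/DatasetC/AOD", 20.0, date(2012, 11, 30)),
]

def unaccessed_space(replicas, site, since):
    """Total size of replicas at `site` with no recorded access since `since`."""
    return sum(size for s, _, size, last in replicas
               if s == site and (last is None or last < since))

print(unaccessed_space(replicas, "T2_SITE_A", date(2013, 1, 1)), "TB")  # 8.5 TB
```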

Relevance:

90.00%

Publisher:

Abstract:

SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM), by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion to the patient and a continuous glucose measurement system. These devices, running a user-friendly application, gather patient-related information and transmit it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), which reside on a Web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. The DDMS, apart from being used for data storage/retrieval, also provides advanced tools for the intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS is used to close the loop between the insulin pump and the continuous glucose monitoring system, by providing the pump with the appropriate insulin infusion rate in order to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, and the platform's evaluation in a clinical environment is in progress.
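
Purely as an illustration of the PU-to-PMU data exchange, the sketch below serializes a minimal glucose/insulin record; the field names are hypothetical and do not reflect the actual SMARTDIAB message format.

```python
# Illustrative only: a minimal record a patient unit (PU) might transmit
# to the patient management unit (PMU). Hypothetical field names.
import json
from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass
class GlucoseSample:
    patient_id: str
    timestamp: str               # ISO 8601 string
    glucose_mg_dl: float         # continuous glucose monitor reading
    insulin_rate_u_per_h: float  # current pump basal rate

sample = GlucoseSample("patient-042", datetime(2010, 6, 1, 8, 30).isoformat(), 142.0, 1.2)
payload = json.dumps(asdict(sample))   # what the PU would send over the network
print(payload)
```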

Relevance:

90.00%

Publisher:

Abstract:

Jahnke and Asher explore workflows and methodologies at a variety of academic data curation sites, and Keralis delves into the academic milieu of library and information schools that offer instruction in data curation. Their conclusions point to the urgent need for a reliable and increasingly sophisticated professional cohort to support data-intensive research in our colleges, universities, and research centers.

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
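
A small sketch of the data-quality metric used in the survey, the percentage of missing key variables per site; the patient records below are invented for illustration.

```python
# Sketch: percentage of missing key variables across a site's patient records
# (hypothetical records, illustrative only).
KEY_VARS = ["age", "sex", "clinical_stage", "cd4_count", "art_start_year"]

patients = [
    {"age": 34, "sex": "F", "clinical_stage": 3, "cd4_count": 210, "art_start_year": 2005},
    {"age": None, "sex": "M", "clinical_stage": None, "cd4_count": 180, "art_start_year": 2006},
    {"age": 41, "sex": "F", "clinical_stage": 2, "cd4_count": None, "art_start_year": None},
]

missing = sum(1 for p in patients for v in KEY_VARS if p.get(v) is None)
total = len(patients) * len(KEY_VARS)
print(f"missing key variables: {100 * missing / total:.1f}%")  # 26.7% for this toy sample
```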

Relevance:

90.00%

Publisher:

Abstract:

Ecosystem management policies increasingly emphasize provision of multiple, as opposed to single, ecosystem services. Management for such "multifunctionality" has stimulated research into the role that biodiversity plays in providing desired rates of multiple ecosystem processes. Positive effects of biodiversity on indices of multifunctionality are consistently found, primarily because species that are redundant for one ecosystem process under a given set of environmental conditions play a distinct role under different conditions or in the provision of another ecosystem process. Here we show that the positive effects of diversity (specifically community composition) on multifunctionality indices can also arise from a statistical fallacy analogous to Simpson's paradox (where aggregating data obscures causal relationships). We manipulated soil faunal community composition in combination with nitrogen fertilization of model grassland ecosystems and repeatedly measured five ecosystem processes related to plant productivity, carbon storage, and nutrient turnover. We calculated three common multifunctionality indices based on these processes and found that the functional complexity of the soil communities had a consistent positive effect on the indices. However, only two of the five ecosystem processes also responded positively to increasing complexity, whereas the other three responded neutrally or negatively. Furthermore, none of the individual processes responded to both the complexity and the nitrogen manipulations in a manner consistent with the indices. Our data show that multifunctionality indices can obscure relationships that exist between communities and key ecosystem processes, leading us to question their use in advancing theoretical understanding, and in management decisions, about how biodiversity is related to the provision of multiple ecosystem services.
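
A toy numerical illustration (with invented standardized rates) of how an averaged multifunctionality index can rise while most individual processes respond neutrally or negatively.

```python
# Toy example: an averaging multifunctionality index masking divergent
# responses of individual ecosystem processes (numbers are invented).
import statistics

# Standardised process rates (z-scores) under two community treatments.
low_complexity  = {"productivity": -0.2, "C_storage": -0.1, "N_turnover": 0.3,
                   "decomposition": 0.4, "leaching": 0.2}
high_complexity = {"productivity": 0.6, "C_storage": 0.5, "N_turnover": -0.1,
                   "decomposition": 0.1, "leaching": -0.3}

def index(processes):
    """A common 'averaging' multifunctionality index: mean standardised rate."""
    return statistics.mean(processes.values())

print("index, low complexity  ->", round(index(low_complexity), 2))   # 0.12
print("index, high complexity ->", round(index(high_complexity), 2))  # 0.16: the index rises...
for name in low_complexity:
    delta = high_complexity[name] - low_complexity[name]
    print(name, "response:", round(delta, 2))
# ...even though N_turnover, decomposition and leaching respond negatively.
```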