Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, preventing it from accumulating on the remote system and allowing the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
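As a rough illustration of the REST-style interaction described above, the sketch below drives a G-Rex-like service with a plain HTTP client. The base URL, resource paths, input file name and JSON fields are all hypothetical (the abstract does not specify G-Rex's actual API); only the general pattern of launching a run and streaming output back while it is in progress, then deleting it remotely, is taken from the text.

```python
# Hypothetical REST client for a G-Rex-like service. URLs and field
# names are invented for illustration; they are not the real G-Rex API.
import time
import requests

BASE = "http://cluster.example.ac.uk/grex/nemo"  # hypothetical service URL

# Start a run (the role "GRexRun" plays in the workflow script).
with open("namelist", "rb") as f:                # hypothetical input file
    run = requests.post(f"{BASE}/runs", files={"input": f}).json()
run_url = f"{BASE}/runs/{run['id']}"

# Poll while the run is in progress; download output as it appears so it
# never accumulates on the remote system, then delete it remotely.
while True:
    status = requests.get(run_url).json()
    for name in status.get("new_output_files", []):
        data = requests.get(f"{run_url}/output/{name}").content
        with open(name, "wb") as out:            # monitor/analyse locally
            out.write(data)
        requests.delete(f"{run_url}/output/{name}")  # free remote space
    if status["state"] == "finished":
        break
    time.sleep(30)
```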
Abstract:
In high speed manufacturing systems, continuous operation is desirable, with minimal disruption for repairs and service. An intelligent diagnostic monitoring system, designed to detect developing faults before catastrophic failure, or prior to undesirable reduction in output quality, is a good means of achieving this. Artificial neural networks have already been found to be of value in fault diagnosis of machinery. The aim here is to provide a system capable of detecting a number of faults, in order that maintenance can be scheduled in advance of sudden failure, and to reduce the necessity to replace parts at intervals based on mean time between failures. Instead, parts will need to be replaced only when necessary. Analysis of control information in the form of position error data from two servomotors is described.
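As a sketch of the kind of neural-network diagnosis described above, the following trains a small classifier on features extracted from servomotor position-error traces. The feature choices (RMS, peak, roughness of the error signal), the synthetic data and the network size are illustrative assumptions, not the system described in the abstract.

```python
# Illustrative fault classifier for servo position-error data (assumed
# features, labels and data; the abstract does not specify the network).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def features(error_trace):
    """Summarise one position-error trace as simple statistics."""
    return [np.sqrt(np.mean(error_trace**2)),   # RMS error
            np.max(np.abs(error_trace)),        # peak error
            np.var(np.diff(error_trace))]       # roughness

# Synthetic stand-in data: 0 = healthy, 1 = developing fault.
traces = [rng.normal(0, 0.1 + 0.3 * (i % 2), 500) for i in range(200)]
X = np.array([features(t) for t in traces])
y = np.array([i % 2 for i in range(200)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```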
Abstract:
Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debate requires robust estimates of trends, the spatial distribution of poaching, and the drivers of poaching. We present an analysis of trends and drivers of an indicator of poaching across all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. The data analyzed were site-by-year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002 to 2009. Analysis of these observational data poses a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at the country level were poor governance and low levels of human development, and at the site level, forest cover and the area of the site in regions of low human population density. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008; the results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest levels were identified. The results of the analysis provide a sound information base for evidence-based decision making in the CITES process.
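The poaching index itself is simply the proportion of encountered carcasses that were illegally killed: PIKE = (illegally killed carcasses) / (all carcasses found). A minimal sketch of a Bayesian hierarchical model in this spirit is below, treating site-by-year counts as binomial with a year trend and site-level random effects on the logit scale. The priors, structure and toy data are my assumptions for illustration, not the authors' exact specification.

```python
# Hedged sketch of a hierarchical binomial model for PIKE (assumed
# priors and structure; not the paper's exact model).
import numpy as np
import pymc as pm

# Toy stand-in data: carcasses found and illegally killed per site-year.
n_sites, n_years = 5, 8
total = np.random.default_rng(1).integers(5, 60, size=(n_sites, n_years))
killed = (total * 0.4).astype(int)
year = np.arange(n_years)

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    trend = pm.Normal("trend", 0.0, 1.0)              # common year effect
    site_sd = pm.HalfNormal("site_sd", 1.0)
    site_eff = pm.Normal("site_eff", 0.0, site_sd, shape=n_sites)

    # Logit-scale linear predictor, broadcast to (site, year).
    logit_p = intercept + trend * year + site_eff[:, None]
    p = pm.Deterministic("pike", pm.math.invlogit(logit_p))
    pm.Binomial("obs", n=total, p=p, observed=killed)

    idata = pm.sample(1000, tune=1000, chains=2)
```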
Abstract:
Surface-based GPS measurements of zenith path delay (ZPD) can be used to derive the vertically integrated water vapor (IWV) of the atmosphere. ZPD data are collected in a global network presently consisting of 160 stations as part of the International GPS Service. In the present study, ZPD data from this network are converted into IWV using observed surface pressure and the mean atmospheric water vapor column temperature obtained from the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses (OA). For the four months of January and July of 2000 and 2001, the GPS-derived IWV values are compared to the IWV from the ECMWF OA, with a special focus on the monthly averaged difference (bias) and the standard deviation of daily differences. This comparison shows that the GPS-derived IWV values are well suited for the validation of OA of IWV. For most GPS stations, the IWV data agree quite well with the analyzed data, indicating that both are correct at these locations. Larger differences for individual days are interpreted as errors in the analyses. A dry bias in winter is found over the central United States, Canada, and central Siberia, suggesting a systematic analysis error. Larger differences were mainly found in mountain areas; these were related to representation problems and interpolation difficulties between model height and station height. In addition, the IWV comparison can be used to identify errors or problems in the ZPD observations. These include errors in the data themselves, e.g., erroneous outliers in the measured time series, as well as systematic errors that affect all IWV values at a specific station; such stations were excluded from the intercomparison. Finally, long-term requirements for a GPS-based water vapor monitoring system are discussed.
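The ZPD-to-IWV conversion sketched below follows the widely used Saastamoinen/Bevis formulation: the hydrostatic delay computed from surface pressure is subtracted from ZPD, and the remaining wet delay is scaled by a factor that depends on the mean water vapor column temperature Tm. The refractivity constants are standard literature values (Bevis et al., 1992); whether this study used exactly these values is an assumption.

```python
# Convert GPS zenith path delay (ZPD) to integrated water vapor (IWV)
# via the Saastamoinen hydrostatic delay and the Bevis wet-delay scaling.
# Constants are common literature values; the paper's choices may differ.
import math

def zhd_saastamoinen(p_hpa, lat_rad, h_m):
    """Zenith hydrostatic delay [m] from surface pressure [hPa]."""
    return 0.0022768 * p_hpa / (1 - 0.00266 * math.cos(2 * lat_rad)
                                - 2.8e-7 * h_m)

def iwv_from_zpd(zpd_m, p_hpa, tm_k, lat_rad, h_m):
    """IWV [kg/m^2] from ZPD [m], pressure [hPa], mean temperature Tm [K]."""
    zwd = zpd_m - zhd_saastamoinen(p_hpa, lat_rad, h_m)  # wet delay [m]
    k2_prime = 0.221      # K/Pa   (= 22.1 K/hPa, Bevis et al. 1992)
    k3 = 3.739e3          # K^2/Pa (= 3.739e5 K^2/hPa)
    rv = 461.5            # specific gas constant of water vapor, J/(kg K)
    return zwd / (1e-6 * rv * (k2_prime + k3 / tm_k))

# Example: ZPD of 2.40 m at sea level, 1013 hPa, Tm = 270 K, latitude 45N
print(iwv_from_zpd(2.40, 1013.0, 270.0, math.radians(45.0), 0.0))
# -> about 14 kg/m^2 (equivalently ~14 mm of precipitable water)
```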
Abstract:
The sustainable intelligent building is a building that has the best combination of environmental, social, economic and technical values. Its sustainability assessment involves systems engineering methods and multi-criteria decision-making. Therefore, first, a wireless monitoring system for the sustainability parameters of intelligent buildings is implemented; second, the indicators and key issues for the sustainability of intelligent buildings, based on the whole life cycle, are investigated; third, a sustainability assessment model based on structure entropy and the fuzzy analytic hierarchy process is proposed.
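As a loose illustration of the entropy component of such an assessment model, the sketch below derives indicator weights from the dispersion of monitored sustainability data using the standard entropy-weight method. The indicator matrix is invented, and the paper's actual structure-entropy and fuzzy-AHP formulation may differ substantially from this.

```python
# Entropy-weight sketch for multi-criteria assessment (illustrates the
# general entropy-weighting idea only; not the paper's exact model).
import numpy as np

# Rows = assessed alternatives (e.g. buildings), columns = indicators
# (invented, normalised monitoring values).
X = np.array([[0.82, 0.60, 0.45],
              [0.75, 0.71, 0.50],
              [0.90, 0.55, 0.62]])

P = X / X.sum(axis=0)                            # normalise each indicator
n = X.shape[0]
E = -(P * np.log(P)).sum(axis=0) / np.log(n)     # entropy per indicator
w = (1 - E) / (1 - E).sum()                      # low entropy -> high weight

score = X @ w                                    # weighted assessment score
print("weights:", w, "scores:", score)
```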
Abstract:
Land cover maps at different resolutions and mapping extents contribute to modeling and support decision-making processes. Because land cover affects and is affected by climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the project Latin American Network for Monitoring and Studying of Natural Resources (SERENA), which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLaTIF). The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from the class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to those used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent, which will allow the two maps to be combined into consistent data across the Americas, facilitating continental monitoring and modeling.
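A minimal sketch of the membership-plus-confidence idea: a tree ensemble trained on per-pixel time-series features yields class membership estimates, from which a discrete label and a confidence layer follow, and the memberships of a scarce class can be summed to estimate its area fractionally. The data and features below are synthetic stand-ins; the SERENA processing chain itself is only described, not reproduced.

```python
# Sketch: per-pixel class memberships from a tree ensemble, then a
# discrete land cover label plus a confidence layer (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels, n_features = 1000, 23          # e.g. 23 MODIS composites/year
X = rng.normal(size=(n_pixels, n_features))
y = rng.integers(0, 5, size=n_pixels)    # 5 hypothetical classes

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

memberships = clf.predict_proba(X)       # per-pixel class memberships
label = memberships.argmax(axis=1)       # discrete map
confidence = memberships.max(axis=1)     # per-pixel confidence layer

# Area of a scarce class estimated from memberships rather than from
# hard-labelled pixel counts (fractional-pixel estimate):
scarce_area = memberships[:, 3].sum()
print(label[:5], confidence[:5], scarce_area)
```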
Abstract:
Land cover plays a key role in global to regional monitoring and modeling because it affects and is affected by climate change, and it has thus become one of the essential variables for climate change studies. National and international organizations require timely and accurate land cover information for reporting and management actions. The North American Land Change Monitoring System (NALCMS) is an international cooperation of organizations and entities of Canada, the United States, and Mexico to map land cover change of North America's changing environment. This paper presents the methodology used to derive the land cover map of Mexico for the year 2005, which was integrated into the NALCMS continental map. The classification was based on a time series of 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) data and an extensive sample database; the complexity of the Mexican landscape required a specific approach to reflect land cover heterogeneity. To estimate the proportion of each land cover class for every pixel, several decision tree classifications were combined to obtain class membership maps, which were finally converted to a discrete map accompanied by a confidence estimate. The map yielded an overall accuracy of 82.5% (Kappa of 0.79) for pixels with at least 50% map confidence (71.3% of the data). An additional assessment with 780 randomly stratified samples, using primary and alternative calls in the reference data to account for ambiguity, indicated 83.4% overall accuracy (Kappa of 0.80). A high agreement of 83.6% for all pixels, and 92.6% for pixels with a map confidence of more than 50%, was found in the comparison between the land cover maps of 2005 and 2006. Further wall-to-wall comparisons with related land cover maps resulted in 56.6% agreement with the MODIS land cover product and a congruence of 49.5% with GlobCover.
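For reference, the overall accuracy and Cohen's kappa figures reported above can be computed from a confusion matrix as in the short sketch below; the matrix shown is a made-up example, not the paper's validation data.

```python
# Overall accuracy and Cohen's kappa from a confusion matrix
# (illustrative numbers only; not the paper's validation data).
import numpy as np

cm = np.array([[50,  4,  2],      # rows: reference class
               [ 6, 60,  5],      # cols: mapped class
               [ 3,  7, 43]])

n = cm.sum()
overall = np.trace(cm) / n                         # observed agreement
expected = (cm.sum(0) * cm.sum(1)).sum() / n**2    # chance agreement
kappa = (overall - expected) / (1 - expected)
print(f"overall accuracy {overall:.3f}, kappa {kappa:.3f}")
```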
Abstract:
Wireless video sensor networks have been a hot topic in recent years; monitoring capability is the central feature of the services they offer, which can be classified into three major categories: monitoring, alerting, and information on demand. These features have been applied to a large number of applications related to the environment (agriculture, water, forest and fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area and industrial monitoring. Security applications oriented toward critical infrastructures and disaster relief are applications that many countries have identified as critical in the near future. This paper aims to design a cross-layer protocol to provide the required quality of service for security-related applications using wireless video sensor networks. Energy saving, delay and reliability of the delivered data are crucial in the proposed application. Simulation results show that the proposed cross-layer protocol offers good performance in terms of providing the required quality of service for the proposed application.
Abstract:
Medication safety and errors are a major concern in care homes. In addition to the identification of incidents, there is a need for a comprehensive system description to avoid the danger of introducing interventions that have unintended consequences and are therefore unsustainable. The aim of the study was to explore the impact and uniqueness of Work Domain Analysis (WDA) in facilitating an in-depth understanding of medication safety problems within the care home system, and to identify the potential benefits of WDA for designing interventions to improve medication safety. A comprehensive, systematic and contextual overview of the care home medication system was developed for the first time. The novel use of the Abstraction Hierarchy (AH) to analyse medication errors revealed the value of the AH in guiding a comprehensive analysis of errors and generating system improvement recommendations that take into account the contextual information of the wider system.
Abstract:
Wireless Sensor Networks (WSNs) have been an exciting topic in recent years. The services offered by a WSN can be classified into three major categories: monitoring, alerting, and information on demand. WSNs have been used for a variety of applications related to the environment (agriculture, water and forest fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area or industrial monitoring. In most WSNs, tasks such as processing the sensed data, making decisions and generating emergency messages are carried out by a remote server, hence the need for efficient means of transferring data across the network. Because of the range of applications and types of WSN, different kinds of MAC and routing protocols are needed in order to guarantee delivery of data from the source nodes to the server (or sink). In order to minimize energy consumption and increase performance in areas such as reliability of data delivery, extensive research has been conducted and documented in the literature on designing energy-efficient protocols for each individual layer. The most common way to conserve energy in WSNs is to use the MAC layer to put the transceiver and the processor of the sensor node into a low-power sleep state when they are not being used; the energy wasted on collisions, overhearing and idle listening is thereby reduced. As a result of this energy-saving strategy, routing protocols need new solutions that take into account the sleep state of some nodes, and that also extend the lifetime of the entire network by distributing energy usage between nodes over time. A combined MAC and routing protocol could therefore significantly improve WSNs, because interaction between the MAC and network layers lets nodes be active at the same time to handle data transmission. In the research presented in this thesis, a cross-layer protocol based on MAC and routing protocols was designed in order to improve the capability of WSNs for a range of different applications. Simulation results, based on a range of realistic scenarios, show that these new protocols improve WSNs by reducing their energy consumption as well as enabling them to support mobile nodes where necessary. A number of conference and journal papers have been published to disseminate these results for a range of applications.
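A back-of-the-envelope sketch of why the duty cycling described above saves energy: a node's average radio power is dominated by how long the transceiver stays active versus asleep. The power figures below are typical order-of-magnitude values for low-power transceivers, assumed here for illustration; they are not from the thesis.

```python
# Duty-cycle energy model for a sensor node's radio (illustrative
# power figures; actual values depend on the transceiver used).
def avg_power_mw(duty_cycle, p_active_mw=60.0, p_sleep_mw=0.003):
    """Average radio power when awake a fraction `duty_cycle` of the time."""
    return duty_cycle * p_active_mw + (1 - duty_cycle) * p_sleep_mw

def lifetime_days(battery_mwh, duty_cycle):
    return battery_mwh / avg_power_mw(duty_cycle) / 24.0

battery = 2 * 1.5 * 2000   # two AA cells: ~2000 mAh at 1.5 V -> 6000 mWh
for dc in (1.0, 0.1, 0.01):    # always-on vs 10% vs 1% duty cycle
    print(f"duty {dc:>4.0%}: {lifetime_days(battery, dc):8.1f} days")
```

With these assumed figures, an always-on radio drains the battery in a few days, while a 1% duty cycle extends the lifetime to over a year, which is why MAC-layer sleep scheduling, and routing that is aware of it, matters so much.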
Abstract:
The progress in wearable and implanted health monitoring technologies has strong potential to alter the future of healthcare services by enabling ubiquitous monitoring of patients. A typical health monitoring system consists of a network of wearable or implanted sensors that constantly monitor physiological parameters. Collected data are relayed to the base station, using existing wireless communication protocols, for additional processing. This article provides researchers with information for comparing the existing low-power communication technologies that can potentially support the rapid development and deployment of wireless body area network (WBAN) systems, and focuses mainly on remote monitoring of elderly or chronically ill patients in residential environments.
Abstract:
Hybrid photovoltaic-thermal (PVT) collectors are an emerging technology that combines PV and solar thermal systems in a single solar collector producing heat and electricity simultaneously. The focus of this thesis work is to evaluate the performance of an unglazed open-loop PVT air system integrated on a garage roof in Borlänge, as it is thought to have significant potential for preheating the ventilation air of the building and improving the electrical efficiency of the PV modules. The performance evaluation is important for optimizing the cooling strategy of the collector in order to enhance its electrical efficiency and maximize the production of thermal energy. The evaluation process involves monitoring the electrical and thermal energies for a certain period of time and investigating the effect of cooling on performance by controlling the air mass flow provided by a variable-speed fan connected to the collector by an air distribution duct. The distribution duct transfers the heated outlet air from the collector to the inside of the building. The PVT air collector consists of 34 Solibro CIGS PV modules (115 Wp each), which are roof-integrated and have replaced the traditional roof material. The collector is oriented toward the south-west with a tilt of 29°. It consists of 17 parallel air ducts formed between the PV modules and the insulated roof surface. Each air duct has a depth of 0.05 m, a length of 2.38 m and a width of 2.38 m, and the air ducts are connected to each other through holes. The monitoring system uses T-type thermocouples to measure the relevant temperatures and an air flow sensor to measure the air mass flow; these parameters are needed to calculate the thermal energy. The monitoring system also contains voltage dividers to measure the PV module voltage, shunt resistors to measure the PV current, and AC energy meters, which are needed to calculate the produced electrical energy. All signals from the thermocouples, voltage dividers and shunt resistors are recorded by data loggers. The cooling strategy in this work was based on switching the fan on only when the difference between the air duct temperature (under the middle of the top of a PV column) and the room temperature exceeds 5 °C. This strategy was effective in terms of avoiding high electrical consumption by the fan, and it is recommended for further development. The temperature difference of 5 °C is the minimum value needed to compensate for the heat losses in the collecting duct and distribution duct. The PVT air collector has an area of Ac = 32 m² and an air mass flow of 0.002 kg/s per m². The nominal output power of the collector is 4 kWp (34 CIGS modules of 115 Wp each). With cooling, the collector produces a thermal output energy of 6.88 kWhth/day (0.21 kWhth/m² per day) and an electrical output energy of 13.46 kWhel/day (0.42 kWhel/m² per day). The PVT air collector has a daily thermal energy yield of 1.72 kWhth/kWp and a daily PV electrical energy yield of 3.36 kWhel/kWp. The fan energy requirement in this case was 0.18 kWh/day, which is very small compared to the electrical energy generated by the PV collector. The obtained thermal efficiency was 8%, which is small compared to the results reported in the literature for PVT air collectors; this was due to the small operating air mass flow. Therefore, the study suggests increasing the air mass flow by a factor of 25.
The electrical efficiency fluctuated around 14%, which is higher than the theoretical efficiency of the PV modules; this discrepancy was due to the poor method of recording the solar irradiance at the location. Because of shading effects, it would have been better to use more than one pyranometer.
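A short sanity check of the thermal figures above: the collector's thermal power follows Q = m_dot · cp · ΔT, and the thermal efficiency is the daily thermal yield divided by the incident irradiation on the collector area. The sketch below uses the area, specific air mass flow, daily thermal yield and 8% efficiency given in the abstract; the daily fan run-hours and the in-plane irradiation are assumed round numbers for illustration.

```python
# Thermal balance sketch for the PVT air collector: Q = m_dot * cp * dT.
# Area, specific flow, daily yield and 8% efficiency are from the
# abstract; run-hours and irradiation are illustrative assumptions.
area = 32.0                      # m^2
m_dot = 0.002 * area             # total air mass flow: 0.064 kg/s
cp_air = 1005.0                  # specific heat of air, J/(kg K)

e_th_kwh_day = 6.88              # kWh_th/day (from the abstract)
run_hours = 6.0                  # assumed daily fan operating time

q_w = e_th_kwh_day * 1000.0 / run_hours        # mean thermal power [W]
dT = q_w / (m_dot * cp_air)                    # implied air temperature rise
print(f"mean thermal power {q_w:.0f} W, implied dT {dT:.1f} K")

# Thermal efficiency = thermal yield / incident irradiation:
h_kwh_m2_day = 2.7               # assumed daily in-plane irradiation
eta_th = e_th_kwh_day / (h_kwh_m2_day * area)
print(f"thermal efficiency {eta_th:.1%}")      # ~8%, as reported
```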
Abstract:
Best corporate governance practices published in the primers of the Brazilian Securities and Exchange Commission and the Brazilian Corporate Governance Institute promote as much board independence as possible, as a way to increase the effectiveness of the governance mechanism (Sanzovo, 2010). This paper therefore aims to understand whether what the managerial literature portrays as self-evident (stricter governance, better performance) can be observed in actual evidence. The question answered is: do companies with a stricter control and monitoring system perform better than others? The method applied in this paper consists of comparing 116 companies with respect to the level of independence between the top management team and the board of directors, measured by four parameters, namely, the percentage of independent outsiders on the board, the separation of CEO and chairman, the adoption of contingent compensation, and the percentage of institutional investors in the ownership structure, and with respect to their financial return measured in terms of return on assets (ROA) from the latest quarterly earnings release of 2012. Of the 534 companies listed on the Stock Exchange of Sao Paulo (Bovespa), 116 were selected because of their level of corporate governance: the title "Novo Mercado" refers to the superior governance level of companies listed on Bovespa that must follow specific criteria to assure shareholders' protection (BM&F, 2011). Regression analyses were conducted in order to reveal the level of correlation between the selected variables. The results were as follows: the first regression, relating the four parameters to ROA, yielded a multiple R of 10.26%; the second regression, measuring the correlation between the level of independence of the top management team vis-à-vis the board of directors (namely, relative CEO power) and ROA, led to a multiple R of 5.45%. Understanding that the scale is a simplification of reality, the second part of the analysis transforms all four parameters into dummy variables, excluding what could be called an arbitrary scale. The final model led to a multiple R of 28.44%, which implies that the combination of variables is still not enough to translate the complex reality of organizations. Nonetheless, an important finding can be taken from this paper: two variables (the percentage of outside directors and the percentage of institutional investor ownership) are significant in the regression, with p-values lower than 10% and with negative coefficients. In other words, counter to what the literature often portrays as self-evident (that stricter governance leads to higher performance), this paper provides evidence suggesting that increasing the formal governance structure through outside directors on the board and ownership by institutional investors might actually lead to worse performance. The section on limitations and suggestions for future research presents some reasons why, although supported by a strong theoretical background, this paper faced some challenging methodological assumptions, precluding categorical statements about the relationship between the level of governance, measured by the four selected parameters, and financial return in terms of return on assets.
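A hedged sketch of the kind of regression described above: ROA regressed on the four governance indicators coded as dummy variables. The data and coding thresholds below are synthetic placeholders; the paper's actual sample of 116 Novo Mercado firms and its variable definitions are only summarized in the abstract.

```python
# OLS of ROA on governance dummies (synthetic data; illustrates the
# regression design, not the paper's actual sample or estimates).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 116  # number of firms, as in the study

X = np.column_stack([
    rng.integers(0, 2, n),   # majority of independent outside directors?
    rng.integers(0, 2, n),   # CEO and chairman roles separated?
    rng.integers(0, 2, n),   # contingent compensation adopted?
    rng.integers(0, 2, n),   # high institutional ownership?
])
roa = rng.normal(0.05, 0.04, n)           # placeholder ROA values

model = sm.OLS(roa, sm.add_constant(X)).fit()
print(model.summary())                    # coefficients and p-values
print("multiple R:", np.sqrt(model.rsquared))
```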
Abstract:
The objective of this research is to understand how organizations shape their environment, analyzing why some practices become recognized as 'sustainable' in the Brazilian beef industry. The study engages with the literature on organizational institutionalism by pointing to the need to consider politics (i.e. negotiations between actors) and meanings in order to understand how institutional stability and change occur in a situated context (i.e. in a specific time and place). The research concludes that understandings of what could be recognized as 'sustainability' are the result of actors shaping their environment through actions and interactions that produce meanings. Following a hegemony approach, these disputes occur not only between actors seeking recursive advantages; the actors also seek to defend or attack the social logics that support the dominant actors' position. Moreover, actors exercise their agency over present conditions (i.e. the situated context) on the basis of an inherited past and with the aim of producing a future they imagine. To analyze such processes, a hegemony approach encompassing actors and social logics was developed to highlight the negotiated order, an arena in which actors struggle for hegemony. As a result of such negotiations, a focal issue emerges, influencing the actors' discourse and interests, as well as justifying the initiatives, programs and technologies around that issue, thereby building consensus. Drawing on Critical Realism and Critical Discourse Analysis, the research developed a longitudinal case study, supported by public and confidential documents and interviews with experts, to examine the path of sustainability in the Brazilian beef industry. Three different contexts for agency in relation to sustainability were identified. While the first is marked by silence about sustainability practices, the second emphasizes the emergence of Amazon deforestation as a focal issue, owing to the agency of Greenpeace and the Federal Public Prosecutor's Office (MPF), which forced the industry to develop a monitoring system that traces its cattle suppliers so as to avoid purchases associated with Amazon deforestation, among other illegal activities. Finally, during the third context, the monitoring system allows the beef industry to appropriate sustainability; the beef sector thus builds its legitimacy to influence the risks and opportunities associated with the sustainability context. In terms of social logics, deforestation in the Amazon was denounced as an environmental problem in this industry, anchored in certain features of the logic of capitalism, such as risk management, innovation and productivity gains, global supply chains and governance. Although this attack questions the rationality of rational profit maximization by imposing environmental constraints on firms' behavior, the solution developed is also anchored in the same features of capitalism that were employed to attack it. As a consequence, a gradual change is illustrated by a transformation in the 'quantitative efficiency' of capitalism: productivity gains due to a change in the ratio of resources consumed to output, together with the concern to avoid Amazon deforestation.
However, the 'qualitative efficiency' of capitalism is preserved, since the dominant groups in power still control the means of production and the resources associated with them (i.e. money, power and legitimacy). Since these negotiation processes are mediated by the rationality of avoiding business risk and, consequently, by profit maximization, the hard core of the logic of capitalism is preserved. Therefore, the dominant groups maintain their hegemony.