984 results for indoor management rule


Relevance:

30.00%

Publisher:

Abstract:

Recent 'Global Burden of Disease' studies have provided quantitative evidence of the significant role air pollution plays as a human health risk factor (Lim et al., The Lancet, 380: 2224–2260, 2012). Tobacco smoke, including second-hand smoke, household air pollution from solid fuels and ambient particulate matter are among the top risks, leading to lower life expectancy around the world. Indoor air constitutes an environment particularly rich in different types of pollutants, originating from indoor sources as well as penetrating from outdoors, and mixing, interacting or growing (in the case of microbes) under the protective enclosure of the building envelope. It is therefore not a simple task to follow the dynamics of the processes occurring there, or to quantify their outcomes in terms of pollutant concentrations and other characteristics. This is further complicated by practical limitations, such as building access for the purpose of air quality monitoring, or restrictions on the instrumentation which can be used indoors because of possible interference with the occupants' comfort (due to large size, generated noise or the amount of air drawn). European studies have apportioned the contributions of indoor versus outdoor sources to indoor air contaminants in 26 European countries and quantified the IAQ-associated DALYs (Disability-Adjusted Life Years) in those countries (Jantunen et al., Promoting actions for healthy indoor air (IAIAQ), European Commission Directorate General for Health and Consumers, Luxembourg, 2011). At the same time, there has been an increase in research efforts around the world to better understand the sources, composition, dynamics and impacts of indoor air pollution. Particular focus has been directed towards contemporary sources, novel pollutants and new detection methods. The importance of exposure assessment and of personal exposure, the majority of which occurs in various indoor microenvironments, has also been recognized. Overall, this emerging knowledge has been providing input for global assessments of indoor environments, the impact of indoor pollutants and their science-based management and control. A major outcome of recent international conferences was the recognition that interdisciplinarity, and especially better collaboration between the exposure and indoor sciences, would greatly benefit the health-related evaluation of environmental stress factors and pollutants. A very good example is the combination of biomonitoring with indoor air, particle and dust analysis to study the exposure routes of semi-volatile organic compounds (SVOCs). We have adopted the idea of combining the forces of the exposure and indoor sciences for this Special Issue, identified new and challenging topics, and attracted colleagues who are top researchers in their fields to provide their input. The Special Issue includes papers which collectively present advances in current research topics and, in our view, build a bridge between the indoor and exposure sciences.

Relevance:

30.00%

Publisher:

Abstract:

In school environments, children are constantly exposed to mixtures of airborne substances, derived from a variety of sources both in the classroom and in the school surroundings. It is important to evaluate the hazardous properties of these mixtures in order to conduct risk assessments of their impact on children's health. Within this context, through the application of a Maximum Cumulative Ratio approach, this study aimed to explore whether health risks due to indoor air mixtures are driven by a single substance or are due to cumulative exposure to various substances. The methodology requires knowledge of the concentration of each substance in the air mixture, together with a health-related weighting factor (i.e. a reference concentration or lowest concentration of interest), which is necessary to calculate the Hazard Index. Maximum Cumulative Ratio and Hazard Index values were then used to categorise the mixtures into four groups based on their hazard potential and, therefore, the appropriate risk management strategies. Air samples were collected from classrooms in 25 primary schools in Brisbane, Australia, and the analysis was conducted on the measured concentrations of the substances in about 300 air samples. The results showed that in 92% of the schools, indoor air mixtures belonged to the ‘low concern’ group and therefore did not require any further assessment. In the remaining schools, toxicity was mainly governed by a single substance, with a very small number of schools having a multiple-substance mix which required a combined risk assessment. The proposed approach enables the identification of such schools and thus aids in the efficient health risk management of pollution emissions and air quality in the school environment.
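
As a concrete illustration of the calculation just described, the following minimal sketch computes the Hazard Index and Maximum Cumulative Ratio for a measured mixture and assigns it to a group. The substance names, reference concentrations and grouping thresholds are hypothetical placeholders, not values from the study:

```python
# Illustrative sketch of the Maximum Cumulative Ratio (MCR) approach.
# Concentrations, reference concentrations (RfC) and the grouping
# thresholds below are hypothetical, not taken from the study.

def hazard_quotients(concentrations, reference_concentrations):
    """Hazard quotient per substance: measured concentration / RfC."""
    return {s: concentrations[s] / reference_concentrations[s]
            for s in concentrations}

def classify_mixture(concentrations, reference_concentrations):
    hq = hazard_quotients(concentrations, reference_concentrations)
    hi = sum(hq.values())            # Hazard Index: sum of the quotients
    mcr = hi / max(hq.values())      # MCR: HI relative to the top driver
    if hi < 1:
        group = "low concern"        # no further assessment needed
    elif mcr < 2:
        group = "driven by a single substance"
    else:
        group = "requires combined (cumulative) risk assessment"
    return hi, mcr, group

# Hypothetical classroom air mixture (ug/m3).
conc = {"formaldehyde": 12.0, "benzene": 1.5, "toluene": 40.0}
rfc  = {"formaldehyde": 100.0, "benzene": 30.0, "toluene": 5000.0}
print(classify_mixture(conc, rfc))
```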

Relevance:

30.00%

Publisher:

Abstract:

Major infrastructure and construction (MIC) projects are those with significant traffic or environmental impact, of strategic and regional significance, and of high sensitivity. The decision-making process for schemes of this type is becoming ever more complicated, especially with the increasing number of stakeholders involved and their growing tendency to defend their own varied interests. Failing to address and meet the concerns and expectations of stakeholders may result in project failure, and avoiding this necessitates a systematic, participatory approach to decision-making. Though numerous decision models have been established in previous studies (e.g. the ELECTRE methods, the analytic hierarchy process and the analytic network process), their applicability to the decision process during stakeholder participation in contemporary MIC projects is still uncertain. To resolve this, the decision rule approach is employed for modeling multi-stakeholder, multi-objective project decisions. Through this, the result is obtained naturally according to the “rules” accepted by every stakeholder involved; consensus is therefore more likely to be achieved, since the process is more convincing and the result is easier for all concerned to accept. Appropriate “rules”, comprehensive enough to address multiple objectives while straightforward enough to be understood by multiple stakeholders, are set for resolving conflict and facilitating consensus during the project decision process. The West Kowloon Cultural District (WKCD) project is used as a demonstration case, and a focus group meeting was conducted to confirm the validity of the model established. The results indicate that the model is objective, reliable and practical enough to cope with real-world problems. Finally, a suggested future research agenda is provided.
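
To make the flavour of the decision rule approach concrete, here is a minimal hypothetical sketch: each stakeholder states a rule (minimum acceptable scores on the objectives that matter to it), and an option reaches consensus only if every stakeholder's rule accepts it. The options, objectives, thresholds and scores are invented for illustration and are far simpler than the rules used in the paper:

```python
# Minimal sketch of rule-based screening of project options across
# stakeholders. All names, scores and thresholds are hypothetical.

options = {
    "scheme_A": {"traffic_impact": 0.7, "cost": 0.6, "heritage": 0.9},
    "scheme_B": {"traffic_impact": 0.9, "cost": 0.4, "heritage": 0.8},
}

# Each stakeholder's "rule": minimum acceptable score per objective.
stakeholder_rules = {
    "government": {"cost": 0.5},
    "residents":  {"traffic_impact": 0.6, "heritage": 0.7},
    "developers": {"cost": 0.3},
}

def acceptable(option_scores, rule):
    """An option passes a rule if it meets every stated floor."""
    return all(option_scores[obj] >= floor for obj, floor in rule.items())

# An option reaches consensus if every stakeholder's rule accepts it.
consensus = [name for name, scores in options.items()
             if all(acceptable(scores, r)
                    for r in stakeholder_rules.values())]
print(consensus)   # -> ['scheme_A']
```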

Relevance:

30.00%

Publisher:

Abstract:

The increasing focus of relationship marketing and customer relationship management (CRM) studies on issues of customer profitability has led to the emergence of an area of research on profitable customer management. Nevertheless, there is a notable lack of empirical research examining the current practices of firms with regard to the profitable management of customer relationships according to the approaches suggested in theory. This thesis fills this research gap by exploring profitable customer management in the retail banking sector. Several topics are covered, including marketing metrics and accountability; challenges in the implementation of profitable customer management approaches in practice; analytic versus heuristic (‘rule of thumb’) decision making; and the modification of costly customer behavior in order to increase customer profitability, customer lifetime value (CLV), and customer equity, i.e. the financial value of the customer base. The thesis critically reviews the concept of customer equity and proposes a Customer Equity Scorecard, providing a starting point for a constructive dialog between marketing and finance concerning the development of appropriate metrics to measure marketing outcomes. Since customer management and measurement issues go hand in hand, profitable customer management is contingent on both marketing management skills and financial measurement skills. A clear gap between marketing theory and practice regarding profitable customer management is also identified. The findings show that key customer management aspects that have been proposed in the literature on profitable customer management for many years are not being actively applied by the banks included in the research. Instead, several areas of customer management decision making are found to be influenced by heuristics. This dilemma for marketing accountability is addressed by emphasizing that CLV and customer equity, which are aggregate metrics, provide only general indications of the relative value of customers and the approximate value of the customer base (or groups of customers), respectively. The value created by marketing manifests itself in the effect of marketing actions on customer perceptions, behavior, and ultimately the components of CLV, namely revenues, costs, risk, and retention, as well as additional components of customer equity, such as customer acquisition. The thesis also points out that although costs are a crucial component of CLV, they have largely been neglected in prior CRM research. Cost-cutting has often been viewed negatively in the customer-focused marketing literature on service quality and customer profitability, but the case studies in this thesis demonstrate that reduced costs do not necessarily lead to lower service quality, customer retention, or customer-related revenues. Consequently, this thesis provides an expanded foundation upon which marketers can stake their claim for accountability. By focusing on the full range of drivers and components of CLV and customer equity, marketing has the potential to provide specific evidence concerning how various activities have affected the drivers and components of CLV within different groups of customers, and the implications for customer equity at the customer base level.
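
For orientation, a textbook-style formulation of CLV as the retention-weighted, discounted margin can be sketched as follows; the figures are hypothetical, and this is a standard simplification rather than the exact model used in the thesis:

```python
# Textbook-style CLV sketch: discounted margin weighted by the
# probability that the customer is still retained each period.
# All figures are hypothetical.

def customer_lifetime_value(revenue, cost, retention, discount, horizon):
    """Expected discounted margin over `horizon` periods."""
    return sum((revenue - cost) * retention**t / (1 + discount)**t
               for t in range(horizon))

# A hypothetical retail-banking customer: 300/yr revenue, 120/yr
# cost-to-serve, 85% annual retention, 10% discount rate, 10 years.
base = customer_lifetime_value(300, 120, 0.85, 0.10, 10)

# Cost-reduction scenario echoing the thesis's argument: serving the
# same customer for 100/yr instead of 120/yr, retention unchanged.
leaner = customer_lifetime_value(300, 100, 0.85, 0.10, 10)
print(round(base, 2), round(leaner, 2))
```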

Relevance:

30.00%

Publisher:

Abstract:

The factors affecting non-industrial, private forest (NIPF) landowners' strategic decisions in management planning are studied. A genetic algorithm is used to induce a set of rules predicting the potential cut associated with the landowners' choices of preferred timber management strategies. The rules are based on variables describing the characteristics of the landowners and their forest holdings. The predictive ability of the genetic algorithm is compared to that of linear regression analysis using identical data sets. The data are cross-validated seven times, applying both the genetic algorithm and regression analyses, in order to examine the data sensitivity and robustness of the generated models. The optimal rule set derived from the genetic algorithm analyses included the following variables: mean initial volume, the landowner's positive price expectations for the next eight years, the landowner being classified as a farmer, and a preference for the recreational use of the forest property. When tested with previously unseen test data, the optimal rule set resulted in a relative root mean square error of 0.40. In the regression analyses, the optimal regression equation consisted of the following variables: mean initial volume, proportion of forestry income, intention to cut extensively in the future, and positive price expectations for the next two years. The R2 of the optimal regression equation was 0.34, and the relative root mean square error obtained from the test data was 0.38. In both models, mean initial volume and positive stumpage price expectations entered as significant predictors of the potential cut of the preferred timber management strategy. When tested with the complete data set of 201 observations, both the optimal rule set and the optimal regression model achieved the same level of accuracy.
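
For reference, the relative root mean square error reported above is commonly computed as the RMSE scaled by the mean of the observations; a minimal sketch with hypothetical observed and predicted potential-cut values follows (the study's exact normalisation may differ):

```python
import math

def relative_rmse(observed, predicted):
    """RMSE divided by the mean of the observations, as commonly
    defined; the study may normalise slightly differently."""
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2
                         for o, p in zip(observed, predicted)) / n)
    return rmse / (sum(observed) / n)

# Hypothetical potential-cut observations (m3/ha) and predictions.
obs  = [120.0, 95.0, 140.0, 80.0, 110.0]
pred = [100.0, 105.0, 150.0, 70.0, 120.0]
print(round(relative_rmse(obs, pred), 2))
```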

Relevance:

30.00%

Publisher:

Abstract:

This paper considers the problem of power management and throughput maximization for energy-neutral operation when using Energy Harvesting Sensors (EHS) to send data over wireless links. It is assumed that the EHS are designed to transmit data at a constant rate (using a fixed modulation and coding scheme) but are power-controlled. A framework is developed under which the system designer can optimize the performance of EHS when the channel is Rayleigh fading. For example, the highest average data rate that can be supported over a Rayleigh fading channel is derived, given the energy harvesting capability, the battery power storage efficiency and the maximum allowed transmit energy per slot. Furthermore, the optimum transmission scheme that guarantees a particular data throughput is derived. The usefulness of the framework is illustrated through simulation results for specific examples.
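
The following Monte Carlo sketch illustrates the kind of trade-off the framework captures, using a simple truncated channel-inversion policy: transmit at the fixed rate only when the required energy does not exceed the per-slot cap, and check that the average energy spent respects energy-neutral operation. The policy and all parameter values are illustrative assumptions; the paper derives the optimal scheme analytically:

```python
import random

# Monte Carlo sketch of fixed-rate transmission over Rayleigh fading
# with a per-slot energy cap and an energy-neutrality check. This is
# a simple truncated channel-inversion policy for illustration, not
# the paper's optimal scheme.

RATE   = 1.0     # fixed rate, bits/s/Hz (modulation/coding fixed)
E_MAX  = 5.0     # maximum allowed transmit energy per slot (normalised)
H_RATE = 2.0     # mean harvested energy per slot (normalised)
ETA    = 0.8     # battery storage efficiency
SLOTS  = 100_000

spent, delivered = 0.0, 0
for _ in range(SLOTS):
    g = random.expovariate(1.0)        # Rayleigh fading power gain
    needed = (2.0**RATE - 1.0) / g     # energy to support RATE this slot
    if needed <= E_MAX:                # transmit only when affordable
        spent += needed
        delivered += 1

throughput = RATE * delivered / SLOTS
energy_neutral = spent / SLOTS <= ETA * H_RATE
print(f"throughput ~ {throughput:.3f} bits/s/Hz, "
      f"energy neutral: {energy_neutral}")
```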

Relevance:

30.00%

Publisher:

Abstract:

Effective network overload alleviation is essential for maintaining security and integrity from the operational viewpoint of deregulated power systems. This paper develops a methodology to reschedule the active power generation from the sources in order to manage network congestion under normal and contingency conditions. An effective method is proposed using a fuzzy rule-based inference system. The virtual flows concept, which provides the partial contributions/counter-flows in the network elements, is used as the basis of the proposed method to manage network congestion to the extent possible. The proposed method is illustrated on a sample 6-bus test system and on a modified IEEE 39-bus system.
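
The following minimal sketch conveys the spirit of such a fuzzy rule-based inference: the larger a line's overload and a generator's virtual-flow contribution to it, the larger the rescheduling of that generator. The membership functions, rule base and output levels are illustrative assumptions, not those of the paper:

```python
# Minimal fuzzy-inference sketch: map line overload and a generator's
# virtual-flow contribution to a rescheduling amount. Membership
# shapes, rules and numbers are illustrative, not the paper's.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def reschedule_mw(overload_pct, contribution_pct):
    # Fuzzify the two inputs.
    overload = {"low":  tri(overload_pct, -1, 0, 30),
                "high": tri(overload_pct, 10, 50, 101)}
    contrib  = {"small": tri(contribution_pct, -1, 0, 40),
                "large": tri(contribution_pct, 20, 60, 101)}
    # Zero-order Sugeno rule base: (overload, contribution) -> MW cut.
    rules = [("low", "small", 0.0), ("low", "large", 5.0),
             ("high", "small", 10.0), ("high", "large", 40.0)]
    num = den = 0.0
    for ov, co, mw in rules:
        w = min(overload[ov], contrib[co])   # rule firing strength
        num += w * mw
        den += w
    return num / den if den else 0.0

print(reschedule_mw(overload_pct=35, contribution_pct=70))
```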

Relevance:

30.00%

Publisher:

Abstract:

In this article, we analyze how to evaluate fishery resource management under “ecological uncertainty”. In this context, an efficient policy consists of applying a different exploitation rule depending on the state of the resource, and we could say that the stock is always in transition, jumping from one steady state to another. First, we propose a method for calibrating the growth path of the resource such that the observed dynamics of the resource and the catches are matched. Second, we apply the proposed calibration procedure to two different fishing grounds: the European Anchovy (Division VIII) and the Southern Stock of Hake. Our results show that the role played by uncertainty is essential to the conclusions. For the European Anchovy fishery (Division VIII) we find, in contrast with Del Valle et al. (2001), that this is not an overexploited fishing ground. However, we show that the Southern Stock of Hake is in a dangerous situation. In both cases our results are in accordance with ICES advice.
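
As a hedged illustration of such a calibration, the sketch below fits a simple Schaefer surplus-production model to hypothetical biomass observations given recorded catches, via a coarse grid search; the article's actual procedure and growth model may differ:

```python
# Simplified sketch of calibrating a resource growth path so that the
# simulated biomass matches observations, given the recorded catches.
# A Schaefer surplus-production model and a grid search stand in for
# the article's calibration procedure; the data are hypothetical.

def simulate(r, K, b0, catches):
    biomass = [b0]
    for c in catches:
        b = biomass[-1]
        biomass.append(max(b + r * b * (1 - b / K) - c, 1e-9))
    return biomass

def calibrate(observed, catches):
    best = None
    for r in [i / 100 for i in range(5, 101, 5)]:    # r in [0.05, 1.0]
        for K in range(100, 2001, 50):               # K in [100, 2000]
            sim = simulate(r, K, observed[0], catches)
            sse = sum((o - s) ** 2 for o, s in zip(observed, sim))
            if best is None or sse < best[0]:
                best = (sse, r, K)
    return best

# Hypothetical biomass observations (kt) and catches (kt).
observed = [500, 520, 510, 480, 470, 465]
catches  = [60, 70, 75, 65, 55]
sse, r, K = calibrate(observed, catches)
print(f"r ~ {r}, K ~ {K}, SSE ~ {sse:.1f}")
```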

Relevance:

30.00%

Publisher:

Abstract:

Management of coastal development in Hawaii is based on the location of the certified shoreline, which represents the upper limit of marine inundation within the last several years. Though the certified shoreline location is significantly more variable than long-term erosion indicators, its migration will still follow the coastline's general trend. The long-term migration of Hawaii's coasts will be significantly controlled by rising sea level; however, land use decisions adjacent to the shoreline and the shape and nature of the nearshore environment are also important controls on coastal migration. Though each of the islands has experienced local sea-level rise over the course of the last century, there are still locations across the islands of Kauai, Oahu, and Maui which show long-term accretion or anomalously high erosion rates relative to their regions. As a result, engineering rules of thumb such as the Bruun rule do not always predict coastal migration and beach profile equilibrium in Hawaii. With coastlines facing all points of the compass rose, anthropogenic alteration of the coasts, complex coastal environments such as coral reefs, and a limited capacity to predict coastal change, Hawaii will require a more robust suite of proactive coastal management policies to weather future changes to its coastline. Continuing to use the current certified shoreline, adopting more stringent coastal setback rules similar to those of Kauai County, adding realistic sea-level rise components to all types of coastal planning, and developing regional beach management plans are some of the recommended adaptation strategies for Hawaii. (PDF contains 4 pages)
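
For context, the Bruun rule referred to above is commonly stated as follows (notation varies between sources):

```latex
% Bruun rule (one common statement): shoreline retreat R caused by
% sea-level rise S, for an active profile of cross-shore width L,
% closure depth h, and berm height B.
R = \frac{L}{B + h}\, S
```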

Relevance:

30.00%

Publisher:

Abstract:

Political drivers such as the Kyoto Protocol, the EU Energy Performance of Buildings Directive and the Energy End-use Efficiency and Energy Services Directive have been implemented in response to an identified need for a reduction in human-related CO2 emissions. Buildings account for a significant portion of global CO2 emissions, approximately 25-30%, and it is widely acknowledged by industry and research organisations that they operate inefficiently. In parallel, unsatisfactory indoor environmental conditions have been proven to negatively impact occupant productivity. Legislative drivers and client education are seen as the key motivating factors for an improvement in the holistic environmental and energy performance of a building. A symbiotic relationship exists between building indoor environmental conditions and building energy consumption, yet traditional Building Management Systems and Energy Management Systems treat these separately. Conventional performance analysis compares building energy consumption with a previously recorded value or with the consumption of a similar building, and does not recognise the fact that all buildings are unique. What is required, therefore, is a new framework which incorporates performance comparison against a theoretical, building-specific ideal benchmark. Traditionally, Energy Managers, who work at the operational level of organisations with respect to building performance, do not have access to ideal performance benchmark information and as a result cannot operate buildings optimally. This thesis systematically defines Holistic Environmental and Energy Management and specifies the Scenario Modelling Technique, which in turn uses an ideal performance benchmark. The holistic technique uses quantified expressions of building performance and by doing so enables the profiled Energy Manager to visualise his actions and their downstream consequences in the context of overall building operation. The Ideal Building Framework facilitates the use of this technique by acting as a Building Life Cycle (BLC) data repository through which ideal building performance benchmarks are systematically structured and stored in parallel with actual performance data. The Ideal Building Framework utilises transformed data in the form of the Ideal Set of Performance Objectives and Metrics, which are capable of defining the performance of any building at any stage of the BLC. It is proposed that the union of Scenario Models for an individual building would result in a building-specific Combination of Performance Metrics, which would in turn be stored in the BLC data repository. The Ideal Data Set underpins the Ideal Set of Performance Objectives and Metrics and is the set of measurements required to monitor the performance of the Ideal Building. A Model View describes the unique building-specific data relevant to a particular project stakeholder. The energy management data and information exchange requirements that underlie a Model View implementation are detailed and incorporate both traditional and proposed energy management. This thesis also specifies the Model View Methodology, which complements the Ideal Building Framework. The developed Model View and Rule Set methodology utilises stakeholder-specific rule sets to define the environmental and energy performance data pertinent to each stakeholder. This generic process further enables each stakeholder to define the desired resolution of the data (e.g. basic, intermediate or detailed).
The Model View methodology is applicable to all project stakeholders, each requiring its own customised rule set. Two rule sets are defined in detail: the Energy Manager rule set and the LEED Accreditor rule set. This measurement generation process, accompanied by the defined Model View, would filter and expedite data access for all stakeholders involved in building performance. Information presentation is critical for effective use of the data provided by the Ideal Building Framework and the Energy Management View definition. The specifications for a customised Information Delivery Tool account for the established profile of Energy Managers and best-practice user interface design. Components of the developed tool could also be used by Facility Managers working at the tactical and strategic levels of organisations. Informed decision making is made possible through specified decision assistance processes which incorporate the Scenario Modelling and Benchmarking techniques, the Ideal Building Framework, the Energy Manager Model View, the Information Delivery Tool and the established profile of Energy Managers. The Model View and Rule Set Methodology is demonstrated on an appropriate mixed-use existing ‘green’ building, the Environmental Research Institute at University College Cork, using the Energy Management and LEED rule sets. Informed decision making is also demonstrated using a prototype scenario for the demonstration building.
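
As a hedged illustration of the Rule Set idea, the sketch below filters a repository of performance metrics through stakeholder-specific rules at a chosen resolution; the metric names, tags and rule definitions are invented placeholders, not the thesis's actual schema:

```python
# Sketch of stakeholder-specific rule sets filtering a building's
# performance-metric repository, in the spirit of the Model View and
# Rule Set methodology. Names, tags and resolutions are hypothetical.

metrics = [
    {"name": "kWh_electricity_monthly", "domain": "energy",  "level": "basic"},
    {"name": "kWh_per_m2_by_zone",      "domain": "energy",  "level": "detailed"},
    {"name": "CO2_ppm_by_room",         "domain": "comfort", "level": "detailed"},
    {"name": "water_use_monthly",       "domain": "water",   "level": "basic"},
]

rule_sets = {
    # An Energy-Manager-style view: energy metrics at any resolution.
    "energy_manager": lambda m: m["domain"] == "energy",
    # A LEED-accreditor-style view: basic metrics across all domains.
    "leed_accreditor": lambda m: m["level"] == "basic",
}

def model_view(stakeholder, resolution="detailed"):
    """Apply a stakeholder's rule set, capped at the desired resolution."""
    order = ["basic", "intermediate", "detailed"]
    rule = rule_sets[stakeholder]
    return [m["name"] for m in metrics
            if rule(m) and order.index(m["level"]) <= order.index(resolution)]

print(model_view("energy_manager", resolution="basic"))
print(model_view("leed_accreditor"))
```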

Relevance:

30.00%

Publisher:

Abstract:

Comfort is, in essence, satisfaction with the environment; with respect to the indoor environment it is primarily satisfaction with the thermal conditions and air quality. Improving comfort has social, health and economic benefits, and is more financially significant than any other building cost. Despite this, comfort is not strictly managed throughout the building life-cycle, mainly due to the lack of an appropriate system to adequately manage comfort knowledge through the construction process and into operation. Previous proposals to improve knowledge management have not been successfully adopted by the construction industry. To address this, the BabySteps approach was devised. BabySteps is an approach, proposed by this research, which states that for an innovation to be adopted by the industry it must be implementable through a number of small changes. This research proposes that improving the management of comfort knowledge will improve comfort. ComMet is a new methodology, proposed by this research, that manages comfort knowledge. It enables comfort knowledge to be captured, stored and accessed throughout the building life-cycle, thus allowing it to be re-used in future stages of the building project and in future projects. It does this using the following: Comfort Performances – these are simplified numerical representations of the comfort of the indoor environment, quantifying the comfort at each stage of the building life-cycle using standard comfort metrics. Comfort Ratings – these are a means of classifying the comfort conditions of the indoor environment according to an appropriate standard. Comfort Ratings are generated by comparing different Comfort Performances, and provide additional information relating to the comfort conditions of the indoor environment which is not readily determined from the individual Comfort Performances. Comfort History – this is a continuous descriptive record of the comfort throughout the project, with a focus on documenting the items and activities, proposed and implemented, which could potentially affect comfort. Each aspect of the Comfort History is linked to the relevant comfort entity it references. Together, these three components create a comprehensive record of the comfort throughout the building life-cycle; they are stored and made available in a common format in a central location, which allows them to be re-used ad infinitum. The LCMS System was developed to implement the ComMet methodology. It uses current and emerging technologies to capture, store and allow easy access to comfort knowledge as specified by ComMet. LCMS is an IT system that combines the following six components: Building Standards; Modelling & Simulation; Physical Measurement through the specially developed Egg-Whisk (Wireless Sensor) Network; Data Manipulation; Information Recording; and Knowledge Storage and Access. Results from a test-case application of the LCMS system - an existing office room at a research facility - highlighted that while some aspects of comfort were being maintained, the building's environment was not in compliance with the acceptable levels stipulated by the relevant building standards. The implementation of ComMet, through LCMS, demonstrates how comfort, typically only considered during early design, can be measured and managed appropriately through systematic application of the methodology as a means of ensuring a healthy internal environment in the building.
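
As a hedged illustration of how a Comfort Rating could be derived from a Comfort Performance, the sketch below classifies a measured CO2 excess over outdoor levels against category bands loosely modelled on European indoor-environment standards; the band limits and the choice of metric are illustrative assumptions, not ComMet's actual tables:

```python
# Sketch of deriving a Comfort Rating by comparing a measured Comfort
# Performance against standard bands. The CO2-based category limits
# below are illustrative assumptions, not ComMet's actual tables.

CO2_BANDS_PPM = [          # (upper limit above outdoors, rating)
    (350, "Category I"),
    (500, "Category II"),
    (800, "Category III"),
]

def comfort_rating(indoor_co2_ppm, outdoor_co2_ppm=400):
    excess = indoor_co2_ppm - outdoor_co2_ppm   # the Comfort Performance
    for limit, rating in CO2_BANDS_PPM:
        if excess <= limit:
            return rating
    return "Category IV (non-compliant)"

# Hypothetical measurement from a wireless sensor in an office room.
print(comfort_rating(indoor_co2_ppm=1020))      # -> Category III
```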

Relevance:

30.00%

Publisher:

Abstract:

The abundance of many commercially important fish stocks is declining, and this has led to widespread concern about the performance of the traditional approach to fisheries management. Quantitative models are used to obtain estimates of population abundance, and the management advice is based on annual harvest levels (Total Allowable Catch, TAC), whereby only a certain amount of catch is allowed from specific fish stocks. However, these models are data-intensive and less useful when stocks have limited historical information. This study examined whether empirical stock indicators can be used to manage fisheries. The relationship between indicators and the underlying stock abundance is not direct and hence can be affected by disturbances that may account for both transient and persistent effects. Methods from Statistical Process Control (SPC) theory, such as the Cumulative Sum (CUSUM) control chart, are useful in classifying these effects and hence can be used to trigger a management response only when a significant impact occurs to the stock biomass. This thesis explores how empirical indicators, along with CUSUM, can be used for the monitoring, assessment and management of fish stocks. I begin my thesis by exploring various age-based catch indicators, to identify those which are potentially useful in tracking the state of fish stocks. The sensitivity and response of these indicators to changes in Spawning Stock Biomass (SSB) showed that indicators based on age groups that are fully selected to the fishing gear, or Large Fish Indicators (LFIs), are most useful and robust across the range of scenarios considered. The Decision-Interval (DI-CUSUM) and Self-Starting (SS-CUSUM) forms are the two types of control chart used in this study. In contrast to the DI-CUSUM, the SS-CUSUM can be initiated without specifying a target reference point (the ‘control mean’) for detecting out-of-control (significant impact) situations. The sensitivity and specificity of the SS-CUSUM showed that its performance is robust when LFIs are used. Once an out-of-control situation is detected, the next step is to determine how much shift has occurred in the underlying stock biomass. If an estimate of this shift is available, it can be used to update the TAC by incorporation into Harvest Control Rules (HCRs). Various methods from Engineering Process Control (EPC) theory were tested to determine which can measure the shift size in stock biomass with the highest accuracy. Results showed that methods based on Grubbs' harmonic rule gave reliable shift-size estimates. The accuracy of these estimates can be improved by monitoring a combined indicator metric of stock-recruitment and LFI, because this may account for impacts independent of fishing. The procedure of integrating both SPC and EPC is known as Statistical Process Adjustment (SPA). An HCR based on SPA was designed for the DI-CUSUM, and the scheme was successful in bringing out-of-control fish stocks back to their in-control state. The HCR was also tested using the SS-CUSUM in the context of data-poor fish stocks. Results showed that the scheme is useful for sustaining the initial in-control state of the fish stock until more observations become available for quantitative assessments.
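
For reference, the SPC building block behind both chart variants is the two-sided tabular CUSUM; a minimal sketch with hypothetical Large Fish Indicator values follows (the DI- and SS-CUSUM forms differ mainly in how the reference mean is obtained):

```python
# Standard two-sided tabular CUSUM, the SPC building block behind the
# DI- and SS-CUSUM charts discussed above. Indicator values, the
# reference mean, allowance and decision interval are hypothetical.

def cusum(series, mean, k, h):
    """Return the first index at which the chart signals, else None.
    k: allowance (half the shift to detect); h: decision interval."""
    c_pos = c_neg = 0.0
    for t, x in enumerate(series):
        c_pos = max(0.0, c_pos + (x - mean) - k)   # upward drift
        c_neg = max(0.0, c_neg + (mean - x) - k)   # downward drift
        if c_pos > h or c_neg > h:
            return t
    return None

# Hypothetical Large Fish Indicator series with a downward shift.
lfi = [0.30, 0.31, 0.29, 0.30, 0.28, 0.24, 0.22, 0.21, 0.20, 0.19]
signal = cusum(lfi, mean=0.30, k=0.02, h=0.10)
print(f"out-of-control signalled at observation {signal}")
```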

Relevance:

30.00%

Publisher:

Abstract:

Community-based management and the establishment of marine reserves have been advocated worldwide as means to overcome the overexploitation of fisheries. Yet, researchers and managers are divided regarding the effectiveness of these measures. The “tragedy of the commons” model is often accepted as a universal paradigm, which assumes that unless managed by the State or privatized, common-pool resources are inevitably overexploited due to conflicts between the self-interest of individuals and the goals of the group as a whole. Under this paradigm, the emergence and maintenance of effective community-based efforts that include cooperative, risky decisions such as the establishment of marine reserves could not occur. In this paper, we question these assumptions and show that the outcomes of commons dilemmas can be complex and scale-dependent. We studied the evolution and effectiveness of a community-based management effort to establish, monitor, and enforce a marine reserve network in the Gulf of California, Mexico. Our findings build on social and ecological research before (1997-2001), during (2002) and after (2003-2004) the establishment of the marine reserves, which included participant observation in >100 fishing trips and meetings, interviews, and fishery-dependent and fishery-independent monitoring. We found that locally crafted and enforced harvesting rules led to a rapid increase in resource abundance. Nevertheless, news about this increase spread quickly at a regional scale, resulting in poaching by outsiders and a subsequent rapid cascading effect on fishing resources and compliance with the locally designed rules. We show that cooperation for the management of common-pool fisheries, in which marine reserves form a core component of the system, can emerge, evolve rapidly, and be effective at a local scale even in recently organized fisheries. Stakeholder participation in monitoring, where there is rapid feedback on the system's response, can play a key role in reinforcing cooperation. However, without cross-scale linkages with higher levels of governance, an increase in local fishery stocks may attract outsiders who, if not restricted, will overharvest and threaten local governance. Fishers and fishing communities require incentives to maintain their management efforts. Rewarding effective local management with formal cross-scale governance recognition and support can generate these incentives.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a knowledge-based approach is proposed for the management of temporal information in process control. A common-sense theory of temporal constraints over processes/events, allowing relative temporal knowledge, is employed here as the temporal basis for the system. This theory supports duration reasoning and consistency checking, and accepts relative temporal knowledge in a form normally used by human operators. An architecture for process control is proposed which centres on an historical database consisting of events and processes, together with the qualitative temporal relationships between their occurrences. The dynamics of the system are expressed by means of three types of rule: database-updating rules, process control rules, and data-deletion rules. An example is provided in the form of a life scheduler, to illustrate the database and the rule sets. The example demonstrates the transitions of the database over time, and identifies the procedure in terms of a state-transition model for the application. The dividing-instant problem for logical inference is discussed with reference to this process control example, and it is shown how the temporal theory employed can be used to deal with the problem.
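
A minimal sketch of the architecture's core idea, an historical database of events with qualitative temporal relations plus a database-updating rule, might look as follows; the event names and the naive transitive inference are illustrative only, and the paper's theory is far richer:

```python
# Tiny sketch of an historical database of events with qualitative
# temporal relations and one database-updating rule, in the spirit of
# the architecture described above. The paper's common-sense theory
# (duration reasoning, consistency checking) goes well beyond this.

events = []
before_pairs = set()      # (a, b) meaning: a occurs before b

def record_event(name, after=None):
    """Database-updating rule: append the event and, if an earlier
    occurrence is named, assert the relation 'after before name'."""
    events.append(name)
    if after is not None:
        before_pairs.add((after, name))

def before(a, b, seen=frozenset()):
    """Infer 'a before b' directly or through transitivity."""
    if (a, b) in before_pairs:
        return True
    return any(before(m, b, seen | {a}) for (x, m) in before_pairs
               if x == a and m not in seen)

record_event("valve_opened")
record_event("pressure_rise", after="valve_opened")
record_event("alarm_raised",  after="pressure_rise")
print(before("valve_opened", "alarm_raised"))   # -> True, by transitivity
```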

Relevance:

30.00%

Publisher:

Abstract:

The theory of New Public Management (NPM) suggests that one of the features of advanced liberal rule is the tendency to define social, economic and political issues as problems to be solved through management. This paper argues that the restructuring of Higher Education (HE) in many Western countries since the 1980s has involved a shift from an emphasis on administration and policy to one of efficient management. Utilising Foucault's concept of governmentality, rather than the liberal discourse of management as a politically neutral technology, the paper presents managerialism as a newly emergent and increasingly rationalised disciplinary regime of governmentalising practices in advanced liberalism. As such, the contemporary University as an institution governed by NPM can be shown to have emerged neither as the direct outcome of democratic policies that have rationalised its activities (so that the emancipatory aims of personal development, an educated workforce and true research can be fully realised), nor as the instrument through which individuals or self-realising classes are defeated through the calculations of the state acting on behalf of economic interests; rather, it can be seen as the contingent and intractable outcome of the complex power/knowledge relations of advanced liberalism. I analyse the interlocking of the ‘tutor-subject’ and ‘student-subject’ as a local enacting of policy discourse, informed by the NPM of HE, that reshapes subjectivity and retunes the relationship between tutor and student. I put forward suggestions for how resistance to these new modes of disciplinary subjectification can be enacted.