865 results for Filmic approach methods
Abstract:
Friction plays a key role in road slipperiness: a low coefficient of friction on the road may result in slippery and hazardous conditions. Analyzing the strong relation between friction and accident risk on winter roads is a difficult task. Many weather forecasting organizations use a variety of standard and bespoke methods to predict the coefficient of friction on roads. This article proposes an approach to predicting the extent of slipperiness by building and testing an expert system. The system estimates the coefficient of friction on winter roads in the province of Dalarna, Sweden, using the prevailing weather conditions as a basis. Weather data from the Swedish road weather information system (RWIS) were used. The focus of the project was to use the expert system as part of a larger project, VITSA, within the domain of intelligent transport systems.
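A rule-based friction estimate of this kind can be sketched as follows; the rules, thresholds and output values below are purely illustrative assumptions, not the actual rules of the VITSA expert system:

```python
def estimate_friction(surface_temp_c, precipitation, humidity_pct):
    """Toy rule-based estimate of the road friction coefficient from
    prevailing weather conditions. All thresholds and return values
    are illustrative assumptions, not the real system's rules."""
    if precipitation == "snow" or (precipitation == "rain" and surface_temp_c <= 0.0):
        return 0.15  # ice or packed snow: very slippery
    if surface_temp_c <= 0.0 and humidity_pct > 90.0:
        return 0.25  # hoarfrost risk on a cold, humid surface
    if precipitation == "rain":
        return 0.50  # wet asphalt
    return 0.80      # dry road
```

For example, `estimate_friction(-2.0, "snow", 80.0)` returns 0.15, flagging a hazardous surface, while a dry, warm road yields 0.80.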
Abstract:
This paper uses examples from a Swedish study to suggest some ways in which cultural variation could be included in studies of thermal comfort. It is shown how only a slight shift of focus and methodological approach could help us discover aspects of human life that add to previous knowledge, within comfort research, of how human beings perceive and handle warmth and cold. It is concluded that it is not enough for buildings, heating systems and thermal control devices to be energy-efficient in a merely technical sense. If these are to help decrease, rather than increase, energy consumption, they have to support those parts of already existing habits and modes of thought that have the potential for low energy use. This is one reason why culture-specific features and emotional cores need to be investigated and deployed in the study and development of thermal comfort.
Abstract:
This thesis uses the zonal travel cost method (ZTCM) to estimate the consumer surplus of the Peace & Love festival in Borlänge, Sweden. The study defines counties as the zones of origin of the visitors. Visiting rates from each zone are estimated based on survey data. The study is novel in that TCM has mostly been applied in the environmental and recreational sectors, not to short-term events like the P&L festival. The analysis shows that travel cost has a significantly negative effect on visiting rate, as expected. Even though income has previously been shown to be significant in similar studies, it turns out to be insignificant in this study. A point estimate for the total consumer surplus of the P&L festival is 35.6 million Swedish kronor. However, this point estimate is associated with high uncertainty, since a 95% confidence interval for it is (17.9, 53.2) million Swedish kronor. It is also important to note that the estimated value represents only one part of the total economic value; the other components of the festival's total economic value have not been estimated in this thesis.
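The core zonal travel cost calculation can be sketched in a few lines; the zone data below are invented for illustration and are not the survey figures from the thesis:

```python
import numpy as np

# Hypothetical zones: average round-trip travel cost (SEK) and the
# observed visiting rate (visits per 1000 inhabitants) for each zone.
travel_cost = np.array([100.0, 250.0, 400.0, 600.0, 900.0])
visit_rate = np.array([8.0, 5.5, 3.6, 2.1, 0.9])

# Fit a semi-log trip-generation function: ln(visit_rate) = a + b * cost.
X = np.column_stack([np.ones_like(travel_cost), travel_cost])
a, b = np.linalg.lstsq(X, np.log(visit_rate), rcond=None)[0]

# For a semi-log demand curve, the consumer surplus per trip is -1/b.
cs_per_trip = -1.0 / b
print(f"slope b = {b:.5f}, consumer surplus per trip = {cs_per_trip:.0f} SEK")
```

The fitted slope `b` is negative, matching the finding that travel cost depresses the visiting rate; multiplying the per-trip surplus by the estimated number of trips would give a total-surplus point estimate of the kind reported above.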
Abstract:
Dynamic system test methods for heating systems had been developed and applied by the institutes SERC and SP from Sweden, INES from France and SPF from Switzerland even before the MacSheep project started. These test methods followed the same principle: a complete heating system – including heat generators, storage, control, etc. – is installed on the test rig; the test rig software and hardware simulate and emulate the heat load for space heating and domestic hot water of a single-family house, while the unit under test has to act autonomously to cover the heat demand during a representative test cycle. Within work package 2 of the MacSheep project these similar – but different – test methods were harmonized and improved. The work undertaken includes:
• Harmonization of the physical boundaries of the unit under test.
• Harmonization of the boundary conditions of climate and load.
• Definition of an approach to reach an identical space heat load in combination with autonomous control of the space heat distribution by the unit under test.
• Derivation and validation of new six-day and twelve-day test profiles for direct extrapolation of test results.
The new harmonized test method combines the advantages of the different methods that existed before the MacSheep project. The new method is a benchmark test, which means that the load for space heating and domestic hot water preparation is identical for all tested systems, and that the result is representative of the performance of the system over a whole year. Thus, no modelling and simulation of the tested system is needed to obtain the benchmark results for a yearly cycle. The method is therefore also applicable to products for which simulation models are not yet available. Some of the advantages of the new whole-system test method and performance rating, compared to the testing and energy rating of single components, are:
• Interactions between the different components of a heating system (e.g. storage, solar collector circuit, heat pump, control) are included and evaluated in this test.
• Dynamic effects are included and influence the result just as they influence the annual performance in the field.
• Heat losses influence the results in a more realistic way, since they are evaluated under "real installed" and representative part-load conditions rather than under single-component steady-state conditions.
The described method is also suited for the development process of new systems, where it replaces time-consuming and costly field testing, with the advantages of higher accuracy of the measured data (compared to the measurement equipment typically used in field tests) and identical, thus comparable, boundary conditions. The method can therefore be used for system optimization on the test bench under realistic operative conditions, i.e. under a relevant operating environment in the lab. This report describes the physical boundaries of the tested systems, as well as the test procedures and the requirements for both the unit under test and the test facility. The new six-day and twelve-day test profiles are also described, as are the validation results.
Abstract:
Purpose: This paper aims to extend and contribute to prior research on the association between company characteristics and the choice of capital budgeting methods (CBMs). Design/methodology/approach: A multivariate regression analysis on questionnaire data from 2005 and 2008 is used to study which factors determine the choice of CBMs in Swedish listed companies. Findings: Our results supported the hypotheses that Swedish listed companies have become more sophisticated over the years (or at least less unsophisticated), which indicates a closing of the theory-practice gap; that companies with greater leverage used payback more often; and that companies with stricter debt targets and less management ownership employed accounting rate of return more frequently. Moreover, larger companies used CBMs more often. Originality/value: The paper contributes to prior research within this field by being the first Swedish study to examine the association between the use of CBMs and as many as twelve independent variables, including changes over time, using multivariate regression analysis. The results are compared to a US and a continental European study.
Abstract:
This paper is a preliminary investigation into the application of the formal-logical theory of normative positions to the characterisation of normative-informational positions, pertaining to rules that are meant to regulate the supply of information. First, we present the proposed framework. Next, we identify the kinds of nuances and distinctions that can be articulated in such a logical framework. Finally, we show how such nuances can arise in specific regulations. Reference is made to Data Protection Law and Contract Law, among others. The proposed approach is articulated around two essential steps. The first involves identifying the set of possible interpretations that can be given to a particular norm. This is done by using formal methods. The second involves picking out one of these interpretations as the most likely one. This second step can be resolved only by using further information (e.g., the context or other parts of the regulation).
Abstract:
Canada releases over 150 billion litres of untreated and undertreated wastewater into the water environment every year. To clean up urban wastewater, new federal Wastewater Systems Effluent Regulations (WSER), establishing national baseline effluent quality standards achievable through secondary wastewater treatment, were enacted on July 18, 2012. With respect to wastewater from combined sewer overflows (CSOs), the Regulations require municipalities to report the annual quantity and frequency of effluent discharges. The City of Toronto currently has about 300 CSO locations within an area of approximately 16,550 hectares. The total sewer length of the CSO area is about 3,450 km, and the number of sewer manholes is about 51,100. A system-wide monitoring of all CSO locations has never been undertaken because of the cost and practicality. Instead, the City has relied on estimation methods and modelling approaches in the past, allowing funds that would otherwise be used for monitoring to be applied to reducing the impacts of the CSOs. To fulfill the WSER requirements, the City is now undertaking a study in which GIS-based hydrologic and hydraulic modelling is the approach. Results show the usefulness of this approach for 1) determining the flows contributing to the combined sewer system in the local and trunk sewers under dry weather flow, wet weather flow, and snowmelt conditions; 2) assessing the hydraulic grade line and surface water depth in all local and trunk sewers under heavy rain events; 3) analysing local and trunk sewer capacities for future growth; and 4) reporting the annual quantity and frequency of CSOs as per the requirements of the new Regulations. This modelling approach has also allowed funds to be applied toward reducing and ultimately eliminating the adverse impacts of CSOs rather than expending resources on unnecessary and costly monitoring.
Abstract:
The purpose of this project is to understand, under a social constructionist approach, what meanings external facilitators and organizational members (sponsors) working with dialogic methods place on themselves and their work. Dialogic methods, which aim to engage groups in flows of conversation to envisage and co-create their own future, are growing fast within organizations as a means to achieve collective change. Sharing constructionist ideas about the possibility of multiple realities and about language as constitutive of such realities, dialogue has turned into a promising way toward transformation, especially in a macro context of constant change and increasing complexity, where traditional structures, relationships and forms of work are questioned. Research on the topic has mostly focused on specific methods or applications, with few attempts to study it in a broader sense. Also, despite the fact that dialogic methods work on the assumption that realities are socially constructed, few studies approach the topic from a social constructionist perspective as a research methodology per se. Thus, while most existing research aims at explaining whether or how particular methods meet particular results, my intention is to explore the meanings sustaining these new forms of organizational practice. Data were collected through semi-structured interviews with 25 people working with dialogic methods: 11 facilitators and 14 sponsors, from 8 different organizations in Brazil. Firstly, the research findings indicate several contextual elements that seem to sustain the choice of dialogic methods. Within this context, there does not seem to be a clear or specific demand for dialogic methods, but rather a set of different motivations, objectives and focuses, bringing about several contrasts in the way participants name, describe and explain their experiences with such methods, including tensions around power relations, knowledge creation, identity and communication.
Secondly, some central ideas or images were identified within such contrasts, pointing in both directions: dialogic methods as opportunities for the creation of new organizational realities (with images of a ‘door’ or a ‘flow’, for instance, which suggest that dialogic methods may open up access to other perspectives and the creation of new realities); and dialogic methods as new instrumental mechanisms that seem to reproduce traditional, non-dialogical forms of work and relationship. The individualistic tradition and its tendency toward rational schematism - pointed out by social constructionist scholars as strong traditions in our Western culture - could be observed in some participants’ accounts through the image of dialogic methods as a ‘gym’, for instance, in which dialogical - and idealized - ‘abilities’ could be taught and trained, turning dialogue into a tool rather than a means for transformation. As a conclusion, I discuss what the implications of such taken-for-granted assumptions may be, and offer some insights into dialogue (and dialogic methods) as ‘the art of being together’.
Abstract:
Researchers often rely on the t-statistic to make inference on parameters in statistical models. It is common practice to obtain critical values by simulation techniques. This paper proposes a novel numerical method to obtain an approximately similar test. This test rejects the null hypothesis when the test statistic is larger than a critical value function (CVF) of the data. We illustrate this procedure when regressors are highly persistent, a case in which commonly used simulation methods encounter difficulties controlling size uniformly. Our approach works satisfactorily, controls size, and yields a test which outperforms the two other known similar tests.
Abstract:
This paper characterizes humic substances (HS) extracted from soil samples collected in the Rio Negro basin in the state of Amazonas, Brazil, particularly investigating their reduction capabilities towards Hg(II) in order to elucidate potential mercury cycling/volatilization in this environment. For this purpose, a multimethod approach was used, consisting of both instrumental methods (elemental analysis, EPR, solid-state NMR, FIA combined with cold-vapor AAS of Hg(0)) and statistical methods such as principal component analysis (PCA) and a central composite factorial planning method. The HS under study were divided into two groups, complexing and reducing, owing to the different distributions of their functionalities. The main functionalities (cor)related with reduction of Hg(II) were phenolic, carboxylic and amide groups, while the groups related with complexation of Hg(II) were ethers, hydroxyls, aldehydes and ketones. The HS extracted from floodable regions of the Rio Negro basin presented a greater capacity to retain (to complex, to adsorb physically and/or chemically) Hg(II), while those from non-floodable regions showed a greater capacity to reduce Hg(II), indicating that HS extracted from different types of regions contribute in different ways to the biogeochemical mercury cycle in the basin of the mid-Rio Negro, AM, Brazil.
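The PCA step of such a multimethod workflow can be sketched as follows; the feature matrix here is random stand-in data, not the measured functional-group contents from the paper, and the grouping rule (sign of the first principal component score) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: rows = HS samples, columns = functional-group contents
# (e.g. phenolic, carboxylic, amide, ether, hydroxyl, carbonyl).
X = rng.normal(size=(12, 6))

# PCA via SVD on mean-centred, unit-variance data.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S   # sample coordinates on the principal components
loadings = Vt    # contribution of each functional group to each component

# Split samples into two groups (e.g. "reducing" vs. "complexing")
# by the sign of their score on the first principal component.
group = scores[:, 0] > 0
```

Inspecting `loadings[0]` shows which functional groups drive the separation along the first component, which is how PCA supports this kind of complexing-vs-reducing classification.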
H-infinity control design for time-delay linear systems: a rational transfer function based approach
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Most face recognition approaches require prior training in which a given distribution of faces is assumed in order to predict the identity of test faces. Such an approach may experience difficulty in identifying faces belonging to distributions different from the one provided during training. A face recognition technique that performs well regardless of training is therefore interesting to consider as a basis for more sophisticated methods. In this work, the Census Transform is applied to describe the faces. Based on a scanning window which extracts local histograms of Census features, we present a method that directly matches face samples. With this simple technique, 97.2% of the faces in the FERET fa/fb test were correctly recognized. Despite this being an easy test set, we have found no other approaches in the literature achieving such performance with direct comparisons of faces. A path for further improvement is also presented: among other techniques, we demonstrate how the use of SVMs over the Census Histogram representation can increase recognition performance.
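A minimal sketch of the described pipeline (census transform, scanning-window histograms, direct matching) might look as follows; the 3x3 neighbourhood, the 16-pixel window size and the histogram-intersection matching score are assumptions for illustration, not necessarily the paper's exact choices:

```python
import numpy as np

def census_transform(img):
    """3x3 Census Transform: each pixel becomes an 8-bit code in which
    each bit records whether a neighbour is >= the centre pixel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    centre = img[1:-1, 1:-1]
    bit = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            neigh = img[1 + dy : h - 1 + dy, 1 + dx : w - 1 + dx]
            out |= (neigh >= centre).astype(np.uint8) << bit
            bit += 1
    return out

def census_histograms(img, win=16):
    """Concatenate local histograms of census codes from a scanning window."""
    codes = census_transform(img)
    h, w = codes.shape
    feats = []
    for y in range(0, h - win + 1, win):
        for x in range(0, w - win + 1, win):
            hist, _ = np.histogram(codes[y:y + win, x:x + win],
                                   bins=256, range=(0, 256))
            feats.append(hist)
    return np.concatenate(feats).astype(np.float64)

def match(a, b):
    """Histogram-intersection similarity between two face descriptors."""
    return np.minimum(a, b).sum()
```

Identification then reduces to computing `census_histograms` for a probe face and returning the gallery face with the highest `match` score, with no training stage involved.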