968 results for multicommuted flow analysis
Abstract:
This study concerns the spatial allocation of material flows, with emphasis on construction material in the Irish housing sector. It addresses some of the key issues concerning anthropogenic impact on the environment through spatio-temporal visualisation of the flow of materials, wastes and emissions at different spatial levels. This is presented in the form of a spatial model, Spatial Allocation of Material Flow Analysis (SAMFA), which enables the simulation of construction material flows and associated energy use. SAMFA parallels the Island Limits project (EPA funded under 2004-SD-MS-22-M2), which aimed to create a material flow analysis of the Irish economy classified by industrial sector. SAMFA develops this further by attempting to establish the material flows at the subnational geographical scale that could be used in the development of local authority (LA) sustainability strategies and spatial planning frameworks by highlighting the cumulative environmental impacts of the development of the built environment. By drawing on the idea of planning support systems, SAMFA also aims to provide a cross-disciplinary, integrative medium for involving stakeholders in strategies for a sustainable built environment and, as such, would help illustrate the sustainability consequences of alternative development scenarios. The pilot run of the model in Kildare has shown that the model can be successfully calibrated and applied to develop alternative material-flow and energy-use scenarios at the electoral division (ED) level. This has been demonstrated through the development of an integrated and a business-as-usual scenario, with the former integrating a range of potential material-efficiency and energy-saving policy options and the latter replicating conditions that best describe the current trend. Their comparison shows that the integrated scenario outperforms business as usual in terms of both material and energy use. This report also identifies a number of potential areas of future research and broader application. These include improving the accuracy of the SAMFA model (e.g. by establishing the actual life expectancy of buildings in the Irish context through field surveys) and extending the model to other Irish counties. This would establish SAMFA as a valuable predictive and monitoring tool capable of integrating national and local spatial planning objectives with actual environmental impacts. Furthermore, should the model prove successful at this level, the modelling approach could be transferred to other areas of the built environment, such as commercial development and other key contributors to greenhouse gas emissions. The ultimate aim is to develop a meta-model for predicting the consequences of consumption patterns at the local scale. This therefore offers the possibility of creating critical links between socio-technical systems and the most important challenge of all: the limitations of the biophysical environment.
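As a rough illustration of the kind of scenario comparison the abstract describes (the report does not specify the model at code level), the sketch below contrasts a business-as-usual and an integrated scenario of construction material and energy use per electoral division; all coefficients, ED names and floor areas are invented placeholders.

```python
# Hypothetical sketch of a SAMFA-style scenario comparison per electoral division (ED).
# All figures are illustrative placeholders, not values from the report.

# Projected new floor area (m^2) per ED for the planning period.
new_floor_area = {"ED-001": 12_000, "ED-002": 8_500, "ED-003": 21_000}

SCENARIOS = {
    # material intensity (t/m^2), operational energy (kWh/m^2/yr)
    "business_as_usual": {"material_intensity": 1.8, "energy_intensity": 120.0},
    # integrated scenario: assumed material-efficiency and energy-saving policy options
    "integrated":        {"material_intensity": 1.5, "energy_intensity": 85.0},
}

def scenario_totals(scenario: str) -> dict:
    """Return material (t) and energy (kWh/yr) demand per ED for a scenario."""
    coeffs = SCENARIOS[scenario]
    return {
        ed: {
            "material_t": area * coeffs["material_intensity"],
            "energy_kwh_yr": area * coeffs["energy_intensity"],
        }
        for ed, area in new_floor_area.items()
    }

if __name__ == "__main__":
    bau = scenario_totals("business_as_usual")
    integrated = scenario_totals("integrated")
    for ed in new_floor_area:
        saved_mat = bau[ed]["material_t"] - integrated[ed]["material_t"]
        saved_en = bau[ed]["energy_kwh_yr"] - integrated[ed]["energy_kwh_yr"]
        print(f"{ed}: saves {saved_mat:,.0f} t material, {saved_en:,.0f} kWh/yr")
```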
Abstract:
Objective: Waveform analysis has been used to assess vascular resistance and predict cardiovascular events. We aimed to identify microvascular abnormalities in patients with impaired glucose tolerance (IGT) using ocular waveform analysis. The effects of pioglitazone were also assessed. Methods: Forty patients with IGT and twenty-four controls were studied. Doppler velocity recordings were obtained from the central retinal, ophthalmic and common carotid arteries, and sampled at 200 Hz. A discrete wavelet-based analysis method was employed to quantify the waveforms. The resistive index (RI) was also determined. Patients with IGT were randomised to pioglitazone or placebo and measurements were repeated after 12 weeks of treatment. Results: In the ocular waveforms, significant differences in power spectra were observed in frequency band four (corresponding to frequencies between 6.25 and 12.50 Hz) between groups (p
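A minimal sketch of the two quantities named above: the power in each detail band of a discrete wavelet decomposition of a 200 Hz Doppler velocity waveform, and the resistive index. The wavelet family, decomposition depth and synthetic signal are assumptions for illustration, not the study's exact protocol.

```python
# Discrete-wavelet band powers and resistive index (RI) for a 200 Hz Doppler waveform.
import numpy as np
import pywt

FS = 200  # sampling frequency (Hz), as stated in the abstract

def band_powers(velocity: np.ndarray, wavelet: str = "db4", levels: int = 5) -> list:
    """Power in each detail band of a discrete wavelet decomposition.

    With fs = 200 Hz, detail level d covers roughly fs/2**(d+1) .. fs/2**d Hz,
    so level 4 spans ~6.25-12.5 Hz (the 'frequency band four' of the abstract).
    """
    coeffs = pywt.wavedec(velocity, wavelet, level=levels)
    details = coeffs[1:]          # coeffs[0] is the approximation
    details.reverse()             # order as d1 (highest frequency) .. d{levels}
    return [float(np.sum(d ** 2)) for d in details]

def resistive_index(velocity: np.ndarray) -> float:
    """RI = (peak systolic velocity - end diastolic velocity) / peak systolic velocity."""
    psv, edv = float(np.max(velocity)), float(np.min(velocity))
    return (psv - edv) / psv

if __name__ == "__main__":
    t = np.arange(0, 2, 1 / FS)
    # Synthetic pulsatile velocity trace, for demonstration only.
    synthetic = 30 + 20 * np.abs(np.sin(2 * np.pi * 1.2 * t)) + np.random.normal(0, 1, t.size)
    print("band powers d1..d5:", band_powers(synthetic))
    print("RI:", round(resistive_index(synthetic), 3))
```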
Abstract:
This paper aims to contribute to the ongoing debate on the use of resource accounting tools in regional policy making. The Northern Limits project applied Material Flow Analysis (MFA) and Ecological Footprinting (EF) to regional policy making in Northern Ireland over a number of years. The early phase of the research informed the region's first sustainable development strategy, published in 2006, with key targets relating to the Ecological Footprint and improving the resource efficiency of the economy. Phase II identified the next steps required to address data availability and quality, and the use of MFA and EF in providing a measurement and monitoring framework for the strategy and in the development of the strategy implementation plan. The use of MFA and Ecological Footprinting in sustainable regional policy making, and the monitoring of its implementation, is an ongoing process which has raised a number of research issues that can inform the continued application and development of these and other resource accounting tools within Northern Ireland, provide insights for their use in other regions and help set out the priorities for research to support this important policy area.
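For readers unfamiliar with what a regional MFA account reports, the sketch below computes the standard economy-wide headline indicators (Direct Material Input and Domestic Material Consumption); the tonnages are placeholders, not Northern Ireland data.

```python
# Headline economy-wide MFA indicators (standard Eurostat definitions).
# The figures passed in below are illustrative placeholders only.

def mfa_indicators(domestic_extraction_t: float, imports_t: float, exports_t: float) -> dict:
    """Direct Material Input (DMI) and Domestic Material Consumption (DMC), in tonnes."""
    dmi = domestic_extraction_t + imports_t   # DMI = domestic extraction + imports
    dmc = dmi - exports_t                     # DMC = DMI - exports
    return {"DMI_t": dmi, "DMC_t": dmc}

if __name__ == "__main__":
    print(mfa_indicators(domestic_extraction_t=25e6, imports_t=10e6, exports_t=6e6))
```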
Abstract:
Studies of urban metabolism provide important insights for environmental management of cities, but are not widely used in planning practice due to a mismatch of data scale and coverage. This paper introduces the Spatial Allocation of Material Flow Analysis (SAMFA) model as a potential decision support tool intended to help overcome some of these difficulties, and describes its pilot use at the county level in the Republic of Ireland. The results suggest that SAMFA is capable of identifying hotspots of higher material and energy use to support targeted planning initiatives, while its ability to visualise different policy scenarios supports more effective multi-stakeholder engagement. The paper evaluates this pilot use and sets out how the model can act as an analytical platform for the industrial ecology–spatial planning nexus.
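A minimal sketch of the kind of hotspot screening mentioned above: flag electoral divisions whose modelled material and energy use both fall in the top decile. The ED values are invented, not SAMFA outputs.

```python
# Screen electoral divisions (EDs) for joint material/energy "hotspots".
import numpy as np

eds = ["ED-%03d" % i for i in range(1, 11)]
material_t = np.array([1200, 900, 4300, 800, 3900, 1100, 950, 4100, 1000, 870], dtype=float)
energy_mwh = np.array([310, 250, 990, 230, 870, 300, 260, 940, 280, 240], dtype=float)

def hotspots(values: np.ndarray, quantile: float = 0.9) -> np.ndarray:
    """Boolean mask of observations at or above the given quantile."""
    return values >= np.quantile(values, quantile)

mask = hotspots(material_t) & hotspots(energy_mwh)
print("material/energy hotspots:", [ed for ed, hot in zip(eds, mask) if hot])
```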
Abstract:
The reduction of fluvastatin (FLV) at a hanging mercury-drop electrode (HMDE) was studied by square-wave adsorptive-stripping voltammetry (SWAdSV). FLV can be accumulated and reduced at the electrode, with a maximum peak current intensity at a potential of approximately −1.26 V vs. AgCl/Ag, in an aqueous electrolyte solution of pH 5.25. The method shows linearity between peak current intensity and FLV concentration between 1.0 × 10⁻⁸ and 2.7 × 10⁻⁶ mol L⁻¹. The limits of detection (LOD) and quantification (LOQ) were found to be 9.9 × 10⁻⁹ mol L⁻¹ and 3.3 × 10⁻⁸ mol L⁻¹, respectively. Furthermore, FLV oxidation at a glassy carbon electrode surface was used for its hydrodynamic monitoring by amperometric detection in a flow-injection system. The amperometric signal was linear with FLV concentration over the range 1.0 × 10⁻⁶ to 1.0 × 10⁻⁵ mol L⁻¹, with an LOD of 2.4 × 10⁻⁷ mol L⁻¹ and an LOQ of 8.0 × 10⁻⁷ mol L⁻¹. A sample rate of 50 injections per hour was achieved. Both methods were validated, shown to be precise and accurate, and satisfactorily applied to the determination of FLV in a commercial pharmaceutical.
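As a hedged aside, detection and quantification limits of this kind are commonly estimated from the calibration line as LOD = 3.3·s/m and LOQ = 10·s/m, where s is the residual standard deviation and m the slope; the sketch below uses synthetic calibration points, not the paper's data, and does not claim to reproduce its validation procedure.

```python
# Estimate LOD/LOQ from a linear calibration (LOD = 3.3*s/m, LOQ = 10*s/m).
import numpy as np

conc = np.array([1e-8, 5e-8, 1e-7, 5e-7, 1e-6, 2.7e-6])              # mol L^-1
peak_current = 2.1e6 * conc + np.random.normal(0, 2e-3, conc.size)   # synthetic response

slope, intercept = np.polyfit(conc, peak_current, 1)
residuals = peak_current - (slope * conc + intercept)
s_res = residuals.std(ddof=2)      # residual standard deviation of the linear fit

lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
print(f"LOD ~ {lod:.2e} mol L^-1, LOQ ~ {loq:.2e} mol L^-1")
```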
Abstract:
Information systems are widespread and used by anyone with computing devices, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
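To make the core idea concrete, here is a rough dynamic sketch of a field whose confidentiality label depends on a runtime value stored in another field. The thesis' contribution is a static type system (see the prototype typechecker linked above); this runtime check, the lattice and the record are only illustrative assumptions.

```python
# Illustration only: a field's security label depends on another field's runtime value.
from dataclasses import dataclass

LATTICE = {"public": 0, "user": 1, "admin": 2}   # hypothetical security lattice

@dataclass
class Record:
    owner_role: str      # runtime value that the label of `payload` depends on
    payload: str

    def label(self) -> str:
        # The security level of `payload` is not fixed by its structural type (str);
        # it is indexed by the runtime value of `owner_role`.
        return self.owner_role

def read(record: Record, observer_clearance: str) -> str:
    """Allow the flow only if the observer's clearance dominates the field's label."""
    if LATTICE[observer_clearance] < LATTICE[record.label()]:
        raise PermissionError("illegal information flow")
    return record.payload

if __name__ == "__main__":
    r = Record(owner_role="admin", payload="salary data")
    print(read(r, "admin"))            # permitted: admin clearance dominates the label
    try:
        read(r, "public")              # rejected: public clearance does not
    except PermissionError as err:
        print("rejected:", err)
```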
Abstract:
We apply a recently designed structural analytical method (Minimal Flow Analysis) to the 1990 Senegalese input-output matrix, disaggregated into formal and informal activities, in order to depict the direct and indirect production linkages existing between activities.
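The sketch below is not the Minimal Flow Analysis method itself; it shows the standard route by which direct and indirect production linkages are read off an input-output table, via the technical-coefficient matrix and the Leontief inverse, using an invented 3-sector table.

```python
# Direct and indirect linkages from a toy input-output table via the Leontief inverse.
import numpy as np

Z = np.array([[10., 20.,  5.],      # inter-industry flows (invented)
              [15.,  5., 25.],
              [ 5., 10., 10.]])
x = np.array([100., 120., 90.])     # gross output per sector

A = Z / x                           # direct (technical) coefficients: a_ij = z_ij / x_j
L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse: direct + indirect requirements

print("direct coefficients:\n", A.round(3))
print("total (direct + indirect) linkages:\n", L.round(3))
```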
Abstract:
The purpose of this paper is to present the application of a time-domain tool for three-phase harmonic propagation analysis that uses the Norton model to represent non-linear loads, making the computed harmonic current flows more suitable for operational analysis and for analysing the influence of mitigation elements. The software makes it possible to obtain results closer to the real distribution network, considering voltage and current unbalances and the application of mitigation elements for harmonic distortions. In this scenario, a real case study with network data and the equipment connected to the network is presented, as well as the modeling of non-linear loads based on real data obtained from some PCCs (Points of Common Coupling) of interest to a distribution company.
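A hedged sketch of the Norton representation referred to above: at each harmonic order h the non-linear load is modelled as a current source I_N(h) in parallel with an admittance Y_N(h), so the current it injects at bus voltage V(h) is I(h) = I_N(h) − Y_N(h)·V(h). The parameter values below are invented, not data from the paper's case study.

```python
# Norton-model harmonic current injection for a non-linear load (illustrative values).
import numpy as np

# Hypothetical Norton data per harmonic order (phasors as complex numbers).
norton = {
    5: {"I_N": 12.0 * np.exp(1j * np.deg2rad(-30)), "Y_N": 0.02 + 0.15j},
    7: {"I_N":  8.0 * np.exp(1j * np.deg2rad( 45)), "Y_N": 0.02 + 0.21j},
}

def injected_current(h: int, V_h: complex) -> complex:
    """Harmonic current injected by the Norton-modelled load at harmonic order h."""
    p = norton[h]
    return p["I_N"] - p["Y_N"] * V_h

if __name__ == "__main__":
    for h, V in [(5, 230.0 + 0j), (7, 150.0 + 10j)]:
        I = injected_current(h, V)
        print(f"h={h}: |I|={abs(I):.2f} A, angle={np.degrees(np.angle(I)):.1f} deg")
```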
Abstract:
An extensive sample (2%) of private vehicles in Italy is equipped with a GPS device that periodically measures their position and dynamical state for insurance purposes. Access to this type of data allows the development of theoretical and practical applications of great interest: the real-time reconstruction of the traffic state in a given region, the development of accurate models of vehicle dynamics, and the study of the cognitive dynamics of drivers. For these applications to be possible, we first need to be able to reconstruct the paths taken by vehicles on the road network from the raw GPS data. In fact, these data are affected by positioning errors and are often widely spaced (~2 km apart). For these reasons, the task of path identification is not straightforward. This thesis describes the approach we followed to reliably identify vehicle paths from this kind of low-sampling-rate data. The problem of matching data points to roads is solved with a Bayesian maximum-likelihood approach, while the identification of the path taken between two consecutive GPS measurements is performed with a specifically developed optimal routing algorithm based on the A* algorithm. The procedure was applied to an off-line urban data sample and proved to be robust and accurate. Future developments will extend the procedure to real-time execution and nation-wide coverage.
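A minimal sketch of the A*-based routing step described above: find a plausible road path between two consecutive (already map-matched) GPS fixes on a weighted road graph. The tiny graph, coordinates and costs are invented; the thesis' own routing algorithm and likelihood model are not reproduced here.

```python
# A* search between two map-matched GPS fixes on a toy road graph.
import heapq
import math

# node -> (x, y) coordinates in km (used by the straight-line heuristic)
coords = {"A": (0, 0), "B": (1, 0.2), "C": (2, 0), "D": (1, 1), "E": (2, 1)}
# node -> list of (neighbour, travel cost); costs chosen >= Euclidean distance
graph = {
    "A": [("B", 1.1), ("D", 1.5)],
    "B": [("C", 1.1), ("E", 1.4)],
    "C": [],
    "D": [("E", 1.0)],
    "E": [("C", 1.1)],
}

def heuristic(n: str, goal: str) -> float:
    (x1, y1), (x2, y2) = coords[n], coords[goal]
    return math.hypot(x2 - x1, y2 - y1)   # admissible: never exceeds the true cost

def a_star(start: str, goal: str) -> list:
    frontier = [(heuristic(start, goal), 0.0, start, [start])]
    best_g = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nxt, cost in graph[node]:
            ng = g + cost
            if ng < best_g.get(nxt, math.inf):
                best_g[nxt] = ng
                heapq.heappush(frontier, (ng + heuristic(nxt, goal), ng, nxt, path + [nxt]))
    return []

if __name__ == "__main__":
    print("path A -> C:", a_star("A", "C"))
```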
Abstract:
A large body of research analyzes the runtime execution of a system to extract abstract behavioral views. Those approaches primarily analyze control flow by tracing method execution events, or they analyze object graphs of heap snapshots. However, they do not capture how objects are passed through the system at runtime. We refer to the exchange of objects as the object flow, and we claim that analyzing object flow is necessary if we are to understand the runtime of an object-oriented application. We propose and detail Object Flow Analysis, a novel dynamic analysis technique that takes this new information into account. To evaluate its usefulness, we present a visual approach that allows a developer to study classes and components in terms of how they exchange objects at runtime. We illustrate our approach on three case studies.
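A rough sketch of the kind of information such an analysis gathers: which objects are passed into which methods at runtime. The paper's technique instruments the runtime/tracing infrastructure; this decorator-based version and its class names are only illustrative.

```python
# Record which objects flow into which callables at runtime (illustration only).
import functools

object_flow = []   # (object id, receiving callable) events

def track_flow(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        for arg in list(args) + list(kwargs.values()):
            object_flow.append((id(arg), fn.__qualname__))
        return fn(*args, **kwargs)
    return wrapper

class Order:
    pass

class Shipping:
    @track_flow
    def dispatch(self, order):
        return f"dispatching {id(order)}"

class Billing:
    @track_flow
    def invoice(self, order):
        return f"invoicing {id(order)}"

if __name__ == "__main__":
    o = Order()
    Shipping().dispatch(o)
    Billing().invoice(o)
    # Every callable the same Order instance flowed through:
    print([fn for oid, fn in object_flow if oid == id(o)])
```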