995 results for Normalized systems
Abstract:
Developing software is still a risky business. After 60 years of experience, this community is still not able to consistently build Information Systems (IS) for organizations with predictable quality, within previously agreed budget and time constraints. Although software is changeable, we are still unable to cope with the amount and complexity of change that organizations demand of their IS. To improve results, developers have followed two alternatives: frameworks that increase productivity but constrain the flexibility of possible solutions, and agile ways of developing software that keep flexibility with fewer upfront commitments. With strict frameworks, specific hacks have to be put in place to get around the framework's construction options. In time this leads to inconsistent architectures that are harder to maintain due to incomplete documentation and human resource turnover. The main goal of this work is to create a new way to develop flexible IS for organizations, using web technologies, in a faster, better and cheaper way that is more suited to handling organizational change. To do so we propose an adaptive object model that uses a new ontology for data and action with strict normalizing rules. These rules should bound the effects of changes, which can then be better tested and corrected. Interfaces are built with templates of resources that can be reused and extended in a flexible way. The "state of the world" for each IS is determined by all production and coordination acts that agents performed over time, even those performed by external systems. When bugs are found during maintenance, their past cascading effects can be checked through simulation, re-running the log of transaction acts over time and checking the results against previous records. This work implements a prototype with part of the proposed system in order to make a preliminary assessment of its feasibility and limitations.
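The replay idea described above can be sketched as a minimal event-sourcing loop. The act names and the state shape below are illustrative assumptions, not details taken from the prototype:

```python
# Minimal sketch of replaying a log of transaction acts to rebuild state.
# Act kinds and the WorldState structure are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Act:
    kind: str          # e.g. "request", "promise", "execute", "accept"
    entity: str
    payload: dict

@dataclass
class WorldState:
    entities: dict = field(default_factory=dict)

def apply_act(state: WorldState, act: Act) -> WorldState:
    # Each act deterministically transforms the state of the world.
    record = state.entities.setdefault(act.entity, {"history": []})
    record["history"].append(act.kind)
    record.update(act.payload)
    return state

def replay(log: list[Act]) -> WorldState:
    # Re-running the full log reconstructs the current state; after a bug
    # fix, the replayed state can be diffed against previously stored records.
    state = WorldState()
    for act in log:
        state = apply_act(state, act)
    return state
```

Because `apply_act` is deterministic, replaying the same log after a fix and comparing the result to stored snapshots localizes exactly which past records a bug corrupted.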
Abstract:
The mechanical properties of film-substrate systems were investigated through nano-indentation experiments in our former paper (Chen, S.H., Liu, L., Wang, T.C., 2005. Investigation of the mechanical properties of thin films by nano-indentation, considering the effects of thickness and different coating-substrate combinations. Surf. Coat. Technol., 191, 25-32), in which Al-on-glass combinations with three different film thicknesses were adopted. It was found that the relation between the hardness H and the normalized indentation depth h/t, where t denotes the film thickness, exhibits three different regimes: (i) the hardness decreases markedly with increasing indentation depth; (ii) the hardness then keeps an almost constant value in the range 0.1-0.7 of the normalized indentation depth h/t; (iii) after that, the hardness increases with increasing indentation depth. In this paper, the indentation image is further investigated and the finite element method is used to analyze the nano-indentation phenomena with both classical plasticity and strain gradient plasticity theories. Both an ideally sharp indenter tip and a rounded one are considered in both theories. We find that classical plasticity theory cannot predict the experimental results, even when the indenter tip curvature is considered. However, strain gradient plasticity theory describes the experimental data very well, both at shallow and at deep indentation depths. Strain gradient and substrate effects are thus shown to coexist in film-substrate nano-indentation experiments. (c) 2006 Elsevier Ltd. All rights reserved.
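The decreasing-hardness regime (i) is the behaviour that strain gradient plasticity is commonly invoked to explain; the standard Nix-Gao relation, quoted here for context (it is a textbook model, not a formula stated in this abstract), captures the indentation size effect as

```latex
\frac{H}{H_0} = \sqrt{1 + \frac{h^{*}}{h}}
```

where \(H_0\) is the hardness in the limit of infinite depth and \(h^{*}\) is a characteristic length that depends on the material and indenter geometry.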
Abstract:
A two-stage H∞-based design procedure is described which uses a normalized coprime factor approach to the robust stabilization of linear systems. A loop-shaping procedure is incorporated to allow the specification of performance characteristics. Theoretical justification of this technique and an outline of the design methodology are given.
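For context, a normalized left coprime factorization writes the plant as a quotient of stable factors satisfying a normalization identity; this is the standard Glover-McFarlane setup, stated here for orientation rather than quoted from the abstract:

```latex
G = \tilde{M}^{-1}\tilde{N},
\qquad
\tilde{N}(j\omega)\,\tilde{N}(j\omega)^{*} + \tilde{M}(j\omega)\,\tilde{M}(j\omega)^{*} = I
```

Robust stabilization then maximizes the size of the allowable perturbations on \(\tilde{N}\) and \(\tilde{M}\), while the loop-shaping stage sets performance by weighting the plant before the factorization.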
Abstract:
Understanding how and why changes propagate during engineering design is critical because most products and systems emerge from predecessors and not through clean-sheet design. This paper applies change propagation analysis methods and extends prior reasoning through examination of a large industrial data set of 41,500 change requests, spanning 8 years during the design of a complex sensor system. Different methods are used to analyze the data, and the results are compared to each other and evaluated in the context of previous findings. In particular, the networks of connected parent, child and sibling changes are resolved over time and mapped to 46 subsystem areas. A normalized change propagation index (CPI) is then developed, showing the relative strength of each area on the absorber-multiplier spectrum between -1 and +1. Multipliers send out more changes than they receive and are good candidates for more focused change management. Another interesting finding is the quantitative confirmation of the "ripple" change pattern. Unlike the earlier prediction, however, it was found that the peak of cyclical change activity occurred late in the program, driven by systems integration and functional testing. Patterns emerged from the data and offer clear implications for technical change management approaches in system design. Copyright © 2007 by ASME.
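A normalized index of this kind is typically computed from the counts of changes a subsystem sends versus receives. The sketch below assumes the simple (out − in)/(out + in) form, which yields exactly the −1 to +1 absorber-multiplier spectrum described above; the paper's precise formula may differ:

```python
def change_propagation_index(changes_out: int, changes_in: int) -> float:
    """Normalized CPI in [-1, +1]: +1 is a pure multiplier, -1 a pure absorber.

    Assumes the simple (out - in) / (out + in) form; a subsystem with no
    recorded changes is treated as neutral (0.0).
    """
    total = changes_out + changes_in
    if total == 0:
        return 0.0
    return (changes_out - changes_in) / total
```

Under this form, a subsystem that initiated 30 changes and absorbed 10 scores +0.5, flagging it as a multiplier worth focused change management.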
Abstract:
Cross-layer techniques are an efficient means to enhance throughput and increase the transmission reliability of wireless communication systems. In this paper, a cross-layer design of aggressive adaptive modulation and coding (A-AMC), truncated automatic repeat request (T-ARQ), and user scheduling is proposed for multiuser multiple-input multiple-output (MIMO) maximal ratio combining (MRC) systems, where the impacts of feedback delay (FD) and limited feedback (LF) on channel state information (CSI) are also considered. The A-AMC and T-ARQ mechanism selects the appropriate modulation and coding schemes (MCSs) to achieve higher spectral efficiency while satisfying the service requirement on the packet loss rate (PLR), profiting from the possibility of using different MCSs to retransmit a packet. Each packet is destined to a scheduled user, selected to exploit multiuser diversity and enhance the system's performance in terms of both transmission efficiency and fairness. The system's performance is evaluated in terms of the average PLR, average spectral efficiency (ASE), outage probability, and average packet delay, all derived in closed form for transmissions over Rayleigh-fading channels. Numerical results and comparisons show that A-AMC combined with T-ARQ yields higher spectral efficiency than the conventional scheme based on adaptive modulation and coding (AMC), while keeping the achieved PLR closer to the system's requirement and reducing delay. Furthermore, the effects of the number of ARQ retransmissions, the numbers of transmit and receive antennas, the normalized FD, and the cardinality of the beamforming weight vector codebook are studied and discussed.
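The baseline AMC step can be sketched as a threshold lookup: pick the highest-rate MCS whose SNR requirement is met at the target PLR. The MCS table and thresholds below are illustrative assumptions, not values from the paper:

```python
# Sketch of conventional AMC mode selection. The aggressive variant (A-AMC)
# would pick a higher-rate mode than these thresholds admit and rely on
# T-ARQ retransmissions (possibly with a different MCS) to recover losses.
MCS_TABLE = [
    # (name, spectral efficiency [bit/s/Hz], min SNR [dB] for target PLR)
    ("BPSK 1/2",  0.5,  2.0),
    ("QPSK 1/2",  1.0,  5.0),
    ("QPSK 3/4",  1.5,  8.0),
    ("16QAM 1/2", 2.0, 11.0),
    ("16QAM 3/4", 3.0, 14.0),
    ("64QAM 3/4", 4.5, 18.0),
]

def select_mcs(snr_db: float):
    """Return the highest-throughput MCS whose SNR threshold is satisfied.

    Falls back to the most robust mode when the channel is very poor,
    mimicking the conservative choice a conventional AMC scheme makes.
    """
    best = MCS_TABLE[0]
    for mcs in MCS_TABLE:
        if snr_db >= mcs[2]:
            best = mcs
    return best
```

The paper's gain comes precisely from relaxing this conservative lookup: an aggressive choice occasionally violates the PLR constraint on the first attempt, but the truncated ARQ retries keep the delivered PLR near the requirement while raising average spectral efficiency.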
Detection and Identification of Abnormalities in Customer Consumptions in Power Distribution Systems
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The seismic behaviour of one-storey asymmetric structures has been studied since the 1970s by a number of research studies, which identified the coupled nature of the translational-torsional response of this class of systems, leading to severe displacement magnifications at the perimeter frames and therefore to a significant increase in local peak seismic demand on the structural elements with respect to equivalent non-eccentric systems (Kan and Chopra 1987). These studies identified the fundamental parameters governing the torsional behaviour of in-plan asymmetric structures (such as the fundamental period TL, the normalized eccentricity e and the torsional-to-lateral frequency ratio Ωϑ) and the associated trends of behaviour. It has been clearly recognized that asymmetric structures characterized by Ωϑ > 1, referred to as torsionally-stiff systems, behave quite differently from structures with Ωϑ < 1, referred to as torsionally-flexible systems. Previous research works by some of the authors proposed a simple closed-form estimation of the maximum torsional response of one-storey elastic systems (Trombetti et al. 2005 and Palermo et al. 2010), leading to the so-called "Alpha Method" for the evaluation of the displacement magnification factors at the corner sides. The present paper provides an upgrade of the "Alpha Method" that removes the assumption of linear elastic response of the system. The main objective is to evaluate how the excursion of the structural elements into the inelastic field (due to reaching the yield strength) affects the displacement demand of one-storey in-plan asymmetric structures. The system proposed by Chopra and Goel in 2007, which is claimed to capture the main features of the non-linear response of in-plan asymmetric systems, is used to perform a large parametric analysis varying all the fundamental parameters of the system, including the inelastic demand, by varying the force reduction factor from 2 to 5. Magnification factors for different force reduction factors are proposed and comparisons with the results obtained from linear analysis are provided.
Abstract:
To reach the goals established by the Institute of Medicine (IOM) and the Centers for Disease Control and Prevention's (CDC) STOP TB USA, measures must be taken to curtail a future peak in tuberculosis (TB) incidence and speed the currently stagnant rate of TB elimination. Both efforts will require, at minimum, the consideration and understanding of the third dimension of TB transmission: the location-based spread of an airborne pathogen among persons known and unknown to each other. This consideration will require an elucidation of the areas within the U.S. that have endemic TB. The Houston Tuberculosis Initiative (HTI) was a population-based active surveillance of confirmed Houston/Harris County TB cases from 1995 to 2004. Strengths of this dataset include the molecular characterization of laboratory-confirmed cases, the collection of geographic locations (including home addresses) frequented by cases, and the HTI time period, which parallels a decline in TB incidence in the United States (U.S.). The HTI dataset was used in this secondary data analysis to implement a GIS analysis of TB cases, the locations frequented by cases, and their association with known risk factors for TB transmission.
This study reports, for the first time, the incidence of TB among the homeless in Houston, Texas. The homeless are an at-risk population for TB disease, yet they are also a population whose TB incidence has been unknown and unreported due to their non-enumeration. The first section of this dissertation identifies local areas in Houston with endemic TB disease. Many Houston TB cases who reported living in these endemic areas also share the TB risk factor of current or recent homelessness. Merging the 2004-2005 Houston enumeration of the homeless with historical HTI surveillance data of TB cases in Houston enabled this first-time report of TB risk among the homeless in Houston.
The homeless were more likely to be US-born, belong to a genotypic cluster, and belong to a cluster of a larger size. The calculated average incidence among homeless persons was 411/100,000, compared to 9.5/100,000 among the housed. These alarming rates are not driven by co-infection but by social determinants. Unsheltered persons were hospitalized for more days and required more follow-up time from staff than those who reported a steady housing situation. The homeless are a specific example of the increased targeting of prevention dollars that could occur if TB rates were reported for specific areas with known health disparities rather than as a generalized rate normalized over a diverse population.
It has been estimated that 27% of Houstonians use public transportation. The city layout allows bus routes to run like veins connecting even the most diverse of populations within the metropolitan area. Secondary data analysis of frequent bus use (defined as riding a route weekly) among TB cases was assessed for its relationship with known TB risk factors. The spatial distribution of genotypic clusters associated with bus use was assessed, along with the reported routes and epidemiologic links among cases belonging to the identified clusters.
TB cases who reported frequent bus use were more likely to have demographic and social risk factors associated with poverty, immune suppression and health disparities. An equal proportion of bus riders and non-bus riders were cultured for Mycobacterium tuberculosis, yet 75% of bus riders were genotypically clustered, indicating recent transmission, compared to 56% of non-bus riders (OR=2.4, 95% CI (2.0, 2.8), p<0.001). Bus riders had a mean cluster size of 50.14 vs. 28.9 (p<0.001). Second-order spatial analysis of clustered fingerprint 2 (n=122), a Beijing family cluster, revealed geographic clustering among cases based on their report of bus use.
Univariate and multivariate analysis of routes reported by cases belonging to these clusters found that 10 of the 14 clusters were associated with bus use. Individual Metro routes, including one route servicing the local hospitals, were found to be risk factors for belonging to a cluster shown to be endemic in Houston. The routes themselves geographically connect the census tracts previously identified as having endemic TB. 78% (15/23) of the Houston Metro routes investigated had one or more print groups reporting frequent use for every HTI study year. We present data on three specific but clonally related print groups and show that bus use is clustered in time by route and is the only known link between cases in one of the three prints: print 22. (Abstract shortened by UMI.)
Abstract:
The tribology of linear tape storage systems, including Linear Tape Open (LTO) and Travan5, was investigated by combining X-ray Photoelectron Spectroscopy (XPS), Auger Electron Spectroscopy (AES), Optical Microscopy and Atomic Force Microscopy (AFM) techniques. The purpose of this study was to understand the tribology mechanism of linear tape systems so that projected recording densities may be achieved in future systems. Water vapour pressure, or Normalized Water Content (NWC), rather than the Relative Humidity (RH) values used almost universally in this field, determined the extent of PTR and stain (if produced) in linear heads. Approximately linear dependencies were found, with saturated PTR increasing as normalized water content increased over the range studied with the same tape. Fe stain (if produced) formed preferentially on the head surfaces at the lower water contents. The stain formation mechanism was identified: adhesive bond formation is a chemical process governed by temperature. Thus the higher the contact pressure, the higher the contact temperature at the head-tape interface, and hence the higher the probability of adhesive bond formation and the greater the amount of transferred material (stain). Water molecules at the interface saturate the surface bonds and make adhesive junctions less likely. Tape polymeric binder formulation also has a significant role in stain formation, with the latest generation of binders producing less transfer of material, almost certainly due to higher cohesive bonds within the body of the magnetic layer. TiC in the two-phase ceramic tape-bearing surface (AlTiC) was found to oxidise to form TiO2, and the oxidation rate of TiC increased with increasing water content. The oxide was less dense than the underlying carbide; hence the interface between the TiO2 oxide and the TiC was stressed.
Removal of the oxide phase resulted in the formation of three-body abrasive particles that were swept across the tape head, giving rise to three-body abrasive wear, particularly in the pole regions, and hence to PTR with subsequent signal loss and error growth. The lower contact pressure of the LTO system compared with the Travan5 system ensures that fewer and smaller three-body abrasive particles were swept across the poles and insulator regions. Hence lower contact pressure, as well as reducing stain, at the same time significantly reduces PTR in the LTO system.
Abstract:
Desalination of brackish groundwater (BW) is an effective approach to augment water supply, especially for inland regions that are far from seawater resources. Brackish water reverse osmosis (BWRO) desalination is still subject to intensive energy consumption compared to the theoretical minimum energy demand. Here, we review some of the BWRO plants with various system arrangements. We look at how to minimize energy demands, as these contribute considerably to the cost of desalinated water. Different configurations of the BWRO system have been compared from the viewpoint of normalized specific energy consumption (SEC). The analysis is made at the theoretical limits. The SEC of BWRO can be reduced by (i) increasing the number of stages, (ii) using an energy recovery device (ERD), or (iii) operating the BWRO in batch mode or closed circuit mode. Applying more stages not only reduces SEC but also improves water recovery; however, this improvement is less pronounced when the number of stages exceeds four. Alternatively and more favourably, the BWRO system can be operated in Closed Circuit Desalination (CCD) mode, which gives an SEC comparable to that of the 3-stage system at a recovery ratio of 80%. A further reduction of about 30% in SEC can be achieved through batch-RO operation. Moreover, the costly ERDs and booster pumps are avoided with both CCD and batch-RO, further lowering the costs of these innovative approaches. © 2012 by the authors.
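The theoretical-limit comparison can be illustrated with the standard thermodynamic SEC expressions for RO: a single stage with ideal energy recovery must apply the final brine osmotic pressure, giving SEC = π0/(1 − r), while the reversible batch (infinite-stage) limit is (π0/r)·ln(1/(1 − r)). These are textbook limits quoted for illustration, not figures from the review:

```python
import math

def sec_single_stage(pi_feed_bar: float, recovery: float) -> float:
    """Minimum SEC (expressed in bar, i.e. energy per permeate volume) for a
    single-stage RO with ideal energy recovery: the applied pressure must
    match the final brine osmotic pressure pi_feed / (1 - r)."""
    return pi_feed_bar / (1.0 - recovery)

def sec_batch_limit(pi_feed_bar: float, recovery: float) -> float:
    """Reversible (batch / infinite-stage) limit: the pressure tracks the
    rising osmotic pressure, giving (pi_feed / r) * ln(1 / (1 - r))."""
    return pi_feed_bar / recovery * math.log(1.0 / (1.0 - recovery))
```

For an illustrative brackish feed with π0 = 2 bar at 80% recovery, the single-stage limit is 10 bar-equivalent while the batch limit is about 4 bar-equivalent (1 bar ≈ 0.0278 kWh/m³), which is why staging, CCD and batch operation close so much of the gap to the thermodynamic minimum.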
Abstract:
Remote sensing is a promising approach for above-ground biomass estimation, as forest parameters can be obtained indirectly. Analysis in space and time is quite straightforward due to the flexibility of the method in determining forest crown parameters with remote sensing. It can be used to evaluate and monitor, for example, the development of a forest area over time and the impact of disturbances such as silvicultural practices or deforestation. Vegetation indices, which condense data in a quantitative numeric manner, have been used to estimate several forest parameters, such as volume, basal area and above-ground biomass. The objective of this study was the development of allometric functions to estimate above-ground biomass using vegetation indices as independent variables. The vegetation indices used were the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), Simple Ratio (SR) and Soil-Adjusted Vegetation Index (SAVI). QuickBird satellite data, with 0.70 m spatial resolution, were orthorectified, geometrically and atmospherically corrected, and the digital numbers were converted to top-of-atmosphere (ToA) reflectance. Forest inventory data and published allometric functions at tree level were used to estimate above-ground biomass per plot. Linear functions were fitted for the monospecies and multispecies stands of two evergreen oaks (Quercus suber and Quercus rotundifolia) in multiple-use systems, montados. The allometric above-ground biomass functions were fitted considering the mean and the median of each vegetation index per grid as the independent variable. Species composition, as a dummy variable, was also considered as an independent variable. The linear functions with the best performance are those with mean NDVI or mean SR as the independent variable. Noteworthy is that the two best functions for monospecies cork oak stands have median NDVI or median SR as the independent variable. When species composition dummy variables are included in the function (with stepwise regression), the best model has median NDVI as the independent variable. The vegetation indices with the worst model performance were EVI and SAVI.
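The four indices named above have standard definitions in terms of red, near-infrared (NIR) and blue reflectances. The sketch below uses the commonly adopted defaults (EVI coefficients G=2.5, C1=6, C2=7.5, L=1 and SAVI soil factor L=0.5), which are assumptions on our part rather than values stated in the abstract:

```python
# Standard per-pixel vegetation index formulas from ToA reflectances.
def ndvi(nir: float, red: float) -> float:
    # Normalized Difference Vegetation Index, bounded in [-1, 1].
    return (nir - red) / (nir + red)

def simple_ratio(nir: float, red: float) -> float:
    # Simple Ratio: unbounded NIR/red ratio.
    return nir / red

def savi(nir: float, red: float, soil_l: float = 0.5) -> float:
    # Soil-Adjusted Vegetation Index; soil_l=0.5 is the usual default.
    return (1.0 + soil_l) * (nir - red) / (nir + red + soil_l)

def evi(nir: float, red: float, blue: float) -> float:
    # Enhanced Vegetation Index with the commonly used coefficients.
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)
```

In the study's setup, these per-pixel values would be aggregated to the mean or median per grid cell and then used as the independent variable in the fitted linear biomass functions.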