948 results for Constraints-led approach
Abstract:
This work proposes an LED-based (light-emitting diode) photometer for solid-phase photometry. The photometer was designed so that the radiation source (LED) and the photodetector couple directly to the flow cell, which has an optical path of 4 mm. The flow cell was packed with a solid material (C18) used to immobilize the chromogenic reagent 1-(2-thiazolylazo)-2-naphthol (TAN). Accuracy was assessed against data obtained by ICP OES (inductively coupled plasma optical emission spectrometry). A paired t-test showed no significant difference at the 95% confidence level. Other relevant figures of merit were a linear response range of 0.05 to 0.85 mg L-1 Zn, a detection limit of 9 µg L-1 Zn (n = 3), a standard deviation of 1.4% (n = 10), a sampling rate of 36 determinations per hour, and effluent generation and reagent consumption of 1.7 mL and 0.03 µg per determination, respectively.
Abstract:
Lake Baikal, the world's most voluminous freshwater lake, has experienced unprecedented warming during the last decades. A uniquely diverse amphipod fauna inhabits the littoral zone and can serve as a model system to identify the role of thermal tolerance under climate change. This study aimed to identify sublethal thermal constraints in two of the most abundant endemic Baikal amphipods, Eulimnogammarus verrucosus and Eulimnogammarus cyaneus, and in Gammarus lacustris, a ubiquitous gammarid of the Holarctic. As the latter is found only in some shallow isolated bays of the lake, we further addressed the question of whether rising temperatures could promote the widespread invasion of this non-endemic species into the littoral zone. Animals were exposed to gradual temperature increases (4 weeks at 0.8 °C/d; 24 h at 1 °C/h) starting from the reported annual mean temperature of the Baikal littoral (6 °C). Within the framework of oxygen- and capacity-limited thermal tolerance (OCLTT), we used a nonlinear regression approach to determine the points at which the changing temperature dependence of relevant physiological processes indicates the onset of limitation. Limitations in ventilation, representing the first limits of thermal tolerance (pejus (= "getting worse") temperatures (Tp)), were recorded at 10.6 (95% confidence interval: 9.5, 11.7), 19.1 (17.9, 20.2), and 21.1 (19.8, 22.4) °C in E. verrucosus, E. cyaneus, and G. lacustris, respectively. Field observations revealed that E. verrucosus retreated from the upper littoral to deeper and cooler waters once its Tp was surpassed, identifying Tp as the ecological thermal boundary. Constraints in oxygen consumption at temperatures above the critical temperature (Tc) led to an exponential increase in mortality in all species. Exposure to short-term warming resulted in higher threshold values, consistent with a time dependence of thermal tolerance.
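The breakpoint idea behind the pejus temperatures above can be illustrated with a simple broken-stick regression: fit two lines on either side of each candidate breakpoint and keep the breakpoint that minimizes the total residual error. This is a deliberate simplification of the study's nonlinear regression, and the temperature/rate data below are synthetic, chosen only to mimic a ventilation rate that inflects near 10.6 °C.

```python
def ols(xs, ys):
    """Least-squares slope, intercept, and sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    intercept = my - slope * mx
    sse = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    return slope, intercept, sse

def find_breakpoint(temps, rates, candidates):
    """Grid search: pick the candidate breakpoint with the lowest total SSE."""
    best = None
    for b in candidates:
        left = [(t, r) for t, r in zip(temps, rates) if t <= b]
        right = [(t, r) for t, r in zip(temps, rates) if t > b]
        if len(left) < 3 or len(right) < 3:
            continue  # need enough points on each side for a fit
        sse = ols(*zip(*left))[2] + ols(*zip(*right))[2]
        if best is None or sse < best[1]:
            best = (b, sse)
    return best[0]

# Synthetic ventilation-like data: rises up to ~10.6 °C, then declines.
temps = [6 + 0.2 * i for i in range(60)]  # 6.0 .. 17.8 °C
rates = [1.0 + 0.3 * (t - 6) if t <= 10.6 else 1.0 + 0.3 * 4.6 - 0.5 * (t - 10.6)
         for t in temps]

bp = find_breakpoint(temps, rates, [6 + 0.2 * i for i in range(10, 50)])
print(f"estimated breakpoint: {bp:.1f} °C")
```

With noisy field data, the real analysis would also need confidence intervals on the breakpoint (the study reports 95% intervals), which this sketch omits.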
In conclusion, species-specific limits to oxygen supply capacity are likely key in the onset of constraining (beyond pejus) and then life-threatening (beyond critical) conditions. Ecological consequences of these limits are mediated through behavioral plasticity in E. verrucosus. However, similar upper thermal limits in E. cyaneus (endemic, Baikal) and G. lacustris (ubiquitous, Holarctic) indicate that the potential invader G. lacustris would not necessarily benefit from rising temperatures. Secondary effects of increasing temperatures remain to be investigated.
Abstract:
The flux of organic particles below the mixed layer is one major pathway of carbon from the surface into the deep ocean. The magnitude of this export flux depends on two major processes: remineralization rates and sinking velocities. Here, we present an efficient method to measure sinking velocities of particles in the size range from approximately 3-400 µm by means of video microscopy (FlowCAM®). The method allows rapid measurement and automated analysis of mixed samples and was tested with polystyrene beads, different phytoplankton species, and sediment trap material. Sinking velocities of polystyrene beads were close to theoretical values calculated from Stokes' Law. Sinking velocities of the investigated phytoplankton species were in reasonable agreement with published literature values, and sinking velocities of material collected in the sediment trap increased with particle size. Temperature had a strong effect on sinking velocities due to its influence on seawater viscosity and density. An increase of 9 °C led to a measured increase in sinking velocities of 40%. Given this temperature effect, an average temperature increase of 2 °C, as projected for the sea surface by the end of this century, could increase sinking velocities by about 6%, which could feed back on carbon export into the deep ocean.
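The Stokes' Law comparison mentioned above can be sketched in a few lines: terminal velocity of a small sphere scales with the density contrast and the squared radius, and inversely with viscosity, so warmer (less viscous) water speeds sinking. All physical values below (bead size, densities, viscosities) are representative assumptions for illustration, not figures from the paper.

```python
def stokes_velocity(radius_m, rho_particle, rho_fluid, viscosity_pa_s, g=9.81):
    """Terminal velocity (m/s) of a small sphere in the Stokes regime:
    v = (2/9) * (rho_p - rho_f) * g * r^2 / mu."""
    return 2.0 / 9.0 * (rho_particle - rho_fluid) * g * radius_m ** 2 / viscosity_pa_s

r = 50e-6 / 2        # 50 µm polystyrene bead (assumed size)
rho_bead = 1050.0    # kg/m^3, polystyrene (approximate)
rho_sw = 1025.0      # kg/m^3, seawater (approximate)

# Assumed dynamic viscosities for cooler vs. warmer seawater:
v_cold = stokes_velocity(r, rho_bead, rho_sw, 1.4e-3)
v_warm = stokes_velocity(r, rho_bead, rho_sw, 1.1e-3)

print(f"{v_cold * 1e6:.1f} µm/s -> {v_warm * 1e6:.1f} µm/s "
      f"(+{(v_warm / v_cold - 1) * 100:.0f}%)")
```

Because velocity is inversely proportional to viscosity, the percentage change depends entirely on the assumed viscosity pair; the paper's 40% figure over 9 °C also folds in the measured density change.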
Abstract:
This paper examines the extent to which electricity supply constraints could affect sectoral specialization. For this purpose, an empirical trade model is estimated from 1990-2008 panel data on 15 OECD countries and 12 manufacturing sectors. We find that, along with Ricardian technological differences and Heckscher-Ohlin factor-endowment differences, productivity-adjusted electricity capacity drives sectoral specialization in several sectors. Among them, electrical equipment, transport equipment, machinery, chemicals, and paper products will see lower output shares as a result of decreases in productivity-adjusted electricity capacity. Furthermore, our dynamic panel estimation reveals that the effects of Ricardian technological differences dominate in the short run, while factor-endowment differences and productivity-adjusted electricity capacity tend to have a significant effect only in the long run.
Abstract:
Architectural decisions are often encoded in the form of constraints and guidelines. Non-functional requirements can be ensured by checking the conformance of the implementation against this kind of invariant. Conformance checking is often a costly and error-prone process that involves the use of multiple tools differing in effectiveness, complexity, and scope of applicability. To reduce the overall effort entailed by this activity, we propose a novel approach that supports verification of human-readable declarative rules through the use of adapted off-the-shelf tools. Our approach consists of a rule specification DSL, called Dicto, and a tool coordination framework, called Probo. The approach has been implemented in a prototype soon to be evaluated.
Abstract:
We discuss how integrity consistency constraints between different UML models can be precisely defined at the language level. To do so, we introduce a formal object-oriented metamodeling approach in which integrity consistency constraints between UML models are defined as invariants over the UML model elements used to define the models at the language level. The constraints are formally specified using Object-Z. We demonstrate how integrity consistency constraints for UML models can be precisely defined at the language level; once completed, the formal description of the consistency constraints serves as a precise reference for checking the consistency of UML models as well as for tool development.
Abstract:
We propose the use of the stochastic frontier approach to modelling financial constraints of firms. The main advantage of the stochastic frontier approach over the stylised approaches that use pooled OLS or fixed-effects panel regression models is that we can not only decide whether or not the average firm is financially constrained, but also estimate a measure of the degree of the constraint for each firm and each time period, as well as the marginal impact of firm characteristics on this measure. We then apply the stochastic frontier approach to a panel of Indian manufacturing firms for the 1997–2006 period. In our application, we highlight and discuss the aforementioned advantages, while also demonstrating that the stochastic frontier approach generates regression estimates consistent with the stylised intuition found in the literature on financial constraints and the wider literature on the Indian credit/capital market.
Abstract:
In this paper, we present a novel approach to modeling financing constraints of firms. Specifically, we adopt an approach in which firm-level investment is a nonparametric function of some relevant firm characteristics, cash flow in particular. This enables us to generate firm-year-specific measures of the cash flow sensitivity of investment. We are therefore able to draw conclusions about financing constraints of individual firms as well as cohorts of firms without having to split our sample on an ad hoc basis. This is a significant improvement over the stylized approach based on comparing point estimates of the cash flow sensitivity of investment of the average firm across ad hoc sub-samples of firms. We use firm-level data from India to highlight the advantages of our approach. Our results suggest that the estimates generated by this approach are meaningful from an economic point of view and are consistent with the literature. © 2014 Taylor & Francis.
Abstract:
Nowadays, product development in all its phases plays a fundamental role in the industrial chain. The need for a company to compete at a high level, to respond quickly to market demands, and therefore to engineer the product quickly and to a high standard of quality, has driven the adoption of new, more advanced methods and processes. In recent years, industry has been moving away from 2D-based design and production and toward the concept of Model Based Definition. With this approach, increasingly complex systems become easier to deal with and, above all, cheaper to produce, because the Model Based Definition makes it possible to share data in a lean and simple way across the entire engineering and production chain of the product. The great advantage of this approach is precisely the uniqueness of the information. In this thesis work, the approach is applied to tolerances with the aid of CAD/CAT software. Tolerance analysis, or dimensional variation analysis (DVA), is a way to understand how sources of variation in part dimensions and assembly constraints propagate across parts and assemblies, and how that variation affects the ability of a design to meet its requirements. Tolerances directly affect the cost and performance of products. Worst Case Analysis (WCA) and statistical analysis (RSS) are the two principal methods in DVA. The thesis aims to show the advantages of statistical dimensional analysis by creating and examining various case studies, using PTC CREO for CAD modeling and CETOL 6σ for tolerance analysis. Moreover, a comparison between manual and 3D analysis is provided, focusing on the information lost in the 1D case. The results obtained highlight the need to use this approach from the early stages of the product design cycle.
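The two stack-up methods named above differ in how they combine individual tolerances: worst case adds them directly, while RSS (root sum square) combines them statistically, giving a tighter, more realistic bound for independent dimensions. A minimal sketch, with invented tolerance values rather than figures from the thesis:

```python
import math

def worst_case(tolerances):
    """Worst Case Analysis: tolerances simply add up."""
    return sum(abs(t) for t in tolerances)

def rss(tolerances):
    """Statistical (RSS) analysis: root sum square of the tolerances."""
    return math.sqrt(sum(t * t for t in tolerances))

# Hypothetical ± tolerances (mm) of four parts in a linear stack:
tols = [0.10, 0.05, 0.08, 0.12]

print(f"WCA: ±{worst_case(tols):.3f} mm, RSS: ±{rss(tols):.3f} mm")
```

RSS is always at most the worst-case value, which is why statistical analysis can justify looser (cheaper) part tolerances for the same assembly requirement; the trade-off is that it assumes independent, roughly normal variation.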
Abstract:
The Cananéia-Iguape system, SE Brazil, consists of a complex of lagoonal channels located in a United Nations Educational, Scientific and Cultural Organization (UNESCO) Biosphere Reserve. Nevertheless, important environmental changes have occurred over approximately the last 150 yrs due to the opening of an artificial channel, the Valo Grande, connecting the Ribeira de Iguape River to the lagoonal system. Our objective is to assess the historical record of the uppermost layers of the sedimentary column of the lagoonal system in order to determine the history of environmental changes caused by the opening of the artificial channel. To this end, an integrated geochemical-faunal approach is used. The environmental changes led to significant modifications in salinity, in the depositional patterns of sediments and foraminiferal assemblages (including periods of defaunation), and, most drastically, in the input of heavy metals to the coastal environment. The concentrations of Pb in the core analyzed here were up to two times higher than the values measured in contaminated sediments from the Santos estuary, the most industrialized coastal zone in Brazil.
Abstract:
Background: Population antimicrobial use may influence resistance emergence. Resistance is an ecological phenomenon due to potential transmissibility. We investigated spatial and temporal patterns of ciprofloxacin (CIP) population consumption related to E. coli resistance emergence and dissemination in a major Brazilian city. A total of 4,372 urinary tract infection E. coli cases, 723 of them CIP resistant, were identified in 2002 from two outpatient centres. Cases were address-geocoded in a digital map. Raw CIP consumption data were transformed into usage densities in DDDs by determining the influence zones of CIP selling points. A stochastic model coupled with a Geographical Information System was applied to relate resistance to usage density and to detect city areas of high/low resistance risk. Results: An E. coli CIP-resistant cluster was detected and significantly related to usage density at a level of 5 to 9 CIP DDDs. There were clustered hot-spots and a significant global spatial variation in the residual resistance risk after allowing for usage density. Conclusions: A usage density of 5-9 CIP DDDs per 1,000 inhabitants within the same influence zone was the resistance-triggering level. This level led to E. coli resistance clustering, indicating that individual resistance emergence and dissemination were affected by population-level antimicrobial consumption.
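The usage-density transformation described above, sales volumes converted to DDDs per 1,000 inhabitants within each selling point's influence zone, can be sketched as follows. The WHO defined daily dose for oral ciprofloxacin is taken here as 1 g, and the zone populations and sales figures are invented for illustration; the 5-9 DDD band is the triggering range reported in the abstract.

```python
CIP_DDD_G = 1.0  # assumed WHO defined daily dose (grams) for oral ciprofloxacin

def usage_density(grams_sold, population):
    """CIP usage density: DDDs per 1,000 inhabitants in an influence zone."""
    return grams_sold / CIP_DDD_G / population * 1000.0

# Hypothetical influence zones: (grams of CIP sold, zone population)
zones = {"A": (4500.0, 800_000), "B": (620.0, 90_000), "C": (40.0, 30_000)}

for name, (grams, pop) in zones.items():
    d = usage_density(grams, pop)
    flag = " <- within 5-9 DDD triggering range" if 5 <= d <= 9 else ""
    print(f"zone {name}: {d:.1f} DDD/1,000 inhab.{flag}")
```

A faithful reproduction of the study would additionally need the geocoded case locations and the stochastic spatial model; this only shows the denominator side of the analysis.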
Abstract:
Aims. We calculate the theoretical event rate of gamma-ray bursts (GRBs) from the collapse of massive first-generation (Population III; Pop III) stars. Pop III GRBs could be super-energetic, with isotropic energies up to E_iso ≳ 10^55-10^57 erg, providing a unique probe of the high-redshift Universe. Methods. We consider both the so-called Pop III.1 stars (primordial) and Pop III.2 stars (primordial but affected by radiation from other stars). We employ a semi-analytical approach that accounts for inhomogeneous hydrogen reionization and the chemical evolution of the intergalactic medium. Results. We show that Pop III.2 GRBs occur more than 100 times more frequently than Pop III.1 GRBs, and thus should be suitable targets for future GRB missions. Interestingly, our optimistic model predicts an event rate that is already constrained by current radio transient searches. We expect ~10-10^4 radio afterglows above ~0.3 mJy on the sky with ~1 year variability and mostly without GRBs (orphans), detectable by ALMA, EVLA, LOFAR, and SKA, while we expect to observe at most N < 20 GRBs per year integrated over z > 6 for Pop III.2 and N < 0.08 per year integrated over z > 10 for Pop III.1 with EXIST, and N < 0.2 Pop III.2 GRBs per year integrated over z > 6 with Swift.
Abstract:
The kinematic approach to cosmological tests provides direct evidence for the present accelerating stage of the Universe that depends neither on the validity of general relativity nor on the matter-energy content of the Universe. In this context, we consider a linear two-parameter expansion of the deceleration parameter, q(z) = q0 + q1 z, where q0 and q1 are arbitrary constants to be constrained by the Union supernovae data. Assuming a flat Universe, we find that the best fit for the pair of free parameters is (q0, q1) = (-0.73, 1.5), whereas the transition redshift is z_t = 0.49 (+0.14/-0.07 at 1 sigma; +0.54/-0.12 at 2 sigma). This kinematic result agrees with some independent analyses and more easily accommodates many flat dynamical models (like Lambda-CDM).
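The quoted transition redshift follows directly from the linear parametrization: the Universe switches from deceleration to acceleration where q(z) = q0 + q1 z crosses zero, i.e. at z_t = -q0/q1. Plugging in the best-fit values above recovers the reported central value:

```python
def q(z, q0=-0.73, q1=1.5):
    """Linear kinematic expansion of the deceleration parameter."""
    return q0 + q1 * z

# Transition redshift: q(z_t) = 0  =>  z_t = -q0 / q1
z_t = 0.73 / 1.5
print(f"z_t = {z_t:.2f}")  # matches the quoted z_t ~ 0.49
```

This is only the central value; the asymmetric 1-sigma and 2-sigma intervals quoted in the abstract come from the joint confidence region of (q0, q1), not from this point estimate.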