887 results for Input and outputs


Relevance:

100.00%

Publisher:

Abstract:

Acquisition by Processing Theory (APT) is a unified account of language processing and learning that encompasses both L1 and L2 acquisition. Bold in aim and broad in scope, the proposal offers parsimony and comprehensiveness, both highly desirable in a theory of language acquisition. However, the sweep of the proposal is accompanied by an economy of description that makes it difficult to evaluate the validity of key learning claims, or even how literally they are to be interpreted. Two in particular deserve comment; the first concerns the learning mechanisms responsible for adding new L2 grammatical information, and the second the theoretical and empirical status of the activation concept used in the model.


It is widely observed that the global geography of innovation is rapidly evolving. This paper presents evidence concerning the contemporary evolution of the globe's most productive regions. The paper uncovers the underlying structure and co-evolution of knowledge-based resources, capabilities and outputs across these regions. The analysis identifies two key trends by which the economic evolution and growth patterns of these regions are differentiated, namely knowledge-based growth and labour market growth. The knowledge-based growth factor represents the underlying commonality found between the growth of economic output, earnings and a range of knowledge-based resources. The labour market growth factor represents the capability of regions to draw on their human capital. Overall, spectacular knowledge-based growth of leading Chinese regions is evident, highlighting a continued shift of knowledge-based resources to Asia. It is concluded that regional growth in knowledge production investment and the capacity to draw on regional human capital reserves are neither necessarily traded off against each other nor complementary. © 2012 Urban Studies Journal Limited.


In this study, we present a new multiproxy data set of terrigenous input, marine productivity and sea surface temperature (SST) from 52 surface sediment samples collected along E-W transects in the Pacific sector of the Southern Ocean. Allochthonous terrigenous input was characterized by the distribution of plant-wax n-alkanes and soil-derived branched glycerol dialkyl glycerol tetraethers (brGDGTs). 230Th-normalized burial rates of both compound groups were highest close to the potential sources in Australia and New Zealand and are strongly related to lithogenic contents, indicating common sources and transport. Detection of both long-chain n-alkanes and brGDGTs at the most remote sites in the open ocean strongly suggests a primarily eolian transport mechanism to at least 110°W, i.e. by the prevailing westerly winds. Two independent organic SST proxies were used: the UK'37, based on long-chain alkenones, and the TEX86, based on isoprenoid GDGTs. Both indices show robust relationships with temperature over a range between 0.5 and 20°C, likely implying different seasonal and regional imprints on the temperature signal. While alkenone-based temperature estimates reliably reflect modern SST even at the low-temperature end, large temperature residuals are observed for the polar ocean using the TEX86 index. 230Th-normalized burial rates of alkenones are highest close to the Subtropical Front and are positively related to lithogenic fluxes throughout the study area. In contrast, the highest isoGDGT burial, south of the Antarctic Polar Front, is not related to dust flux but may be largely controlled by diatom blooms, and thus high opal fluxes, during austral summer.
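
The index-to-SST conversion behind both proxies can be illustrated with widely used global calibrations. A minimal sketch, assuming the Müller et al. (1998) UK'37 core-top calibration and the Kim et al. (2010) TEX86H calibration; these constants are assumptions taken from the general literature, not from this study:

```python
import math

def sst_from_uk37(uk37):
    """Mueller et al. (1998) global core-top calibration:
    UK'37 = 0.033 * SST + 0.044  ->  SST = (UK'37 - 0.044) / 0.033"""
    return (uk37 - 0.044) / 0.033

def sst_from_tex86h(tex86):
    """Kim et al. (2010) TEX86^H calibration:
    SST = 68.4 * log10(TEX86) + 38.6"""
    return 68.4 * math.log10(tex86) + 38.6
```

Different calibrations diverge most at the cold end of the range, which is consistent with the large polar-ocean residuals reported for TEX86 above.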


Structural Health Monitoring (SHM) denotes a system with the ability to detect and interpret adverse changes in structures in order to improve reliability and reduce life-cycle costs. The greatest challenge in designing an SHM system is knowing what changes to look for and how to classify them. Different approaches to SHM have been proposed for damage identification, each with advantages and drawbacks. This paper presents a methodology for improving vibration signal analysis using statistical information derived from the probability density of the signals. Noise in the input and output signals commonly produces false alarms, so it is important that the methodology minimizes this problem. The proposed approach is experimentally tested on a flexible plate, using a piezoelectric (PZT) actuator to provide the disturbance.
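
A probability-density comparison of this general kind can be sketched as follows; this is not the authors' exact formulation, and the histogram binning, the L1 distance, and the synthetic signals are illustrative assumptions:

```python
import random

def histogram_pdf(signal, bins, lo, hi):
    """Empirical probability density from a fixed-range histogram."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in signal:
        i = min(bins - 1, max(0, int((x - lo) / width)))  # clamp to edge bins
        counts[i] += 1
    n = len(signal)
    return [c / (n * width) for c in counts]

def damage_index(baseline, monitored, bins=30, lo=-3.0, hi=3.0):
    """L1 distance between the two empirical PDFs; larger = more change."""
    p = histogram_pdf(baseline, bins, lo, hi)
    q = histogram_pdf(monitored, bins, lo, hi)
    width = (hi - lo) / bins
    return sum(abs(a - b) for a, b in zip(p, q)) * width

random.seed(0)
healthy = [random.gauss(0.0, 1.0) for _ in range(5000)]
healthy2 = [random.gauss(0.0, 1.0) for _ in range(5000)]  # same state, new noise
damaged = [random.gauss(0.0, 1.4) for _ in range(5000)]   # damage widens the response PDF
print(damage_index(healthy, healthy2) < damage_index(healthy, damaged))  # → True
```

Because the index compares whole distributions rather than single peaks, moderate measurement noise shifts both PDFs similarly, which helps keep the false-alarm rate down.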


Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)


Water budget parameters are estimated for Shark River Slough (SRS), the main drainage within Everglades National Park (ENP), from 2002 to 2008. Inputs to the water budget include surface water inflows and precipitation, while outputs consist of evapotranspiration, discharge to the Gulf of Mexico, and seepage losses due to municipal wellfield extraction. The daily change in volume of SRS is equated to the difference between inputs and outputs, yielding a residual term consisting of component errors and net groundwater exchange. Results indicate significant net groundwater discharge to the SRS, peaking in June and positively correlated, with a one-month lag, with surface water salinity at the mangrove ecotone. Precipitation, the largest input to the SRS, is offset by evapotranspiration, the largest output, thereby highlighting the importance of increasing freshwater inflows into ENP for maintaining conditions in the terrestrial, estuarine, and marine ecosystems of South Florida.
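
The bookkeeping described above can be sketched as follows; the function and the sample numbers are illustrative, not values from the study, and units are arbitrary but must be consistent (e.g. m³/day):

```python
def water_budget_residual(dV, inflow, precip, et, discharge, seepage):
    """Residual = storage change minus (inputs - outputs).
    The residual lumps together component errors and net groundwater
    exchange; a positive residual is consistent with net groundwater
    discharge into the slough."""
    inputs = inflow + precip
    outputs = et + discharge + seepage
    return dV - (inputs - outputs)

# Illustrative day: storage rose by 10 units more/less than the surface
# terms alone would predict -> here the terms balance exactly.
print(water_budget_residual(10.0, 50.0, 40.0, 45.0, 30.0, 5.0))  # → 0.0
```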


A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range of -0.7 to -1.9%. The result is the same if one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in the global mean surface air temperatures is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution by a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations.
As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be made from studies of the response to shorter period forcing changes.
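
The role of the response time can be illustrated with a one-box (first-order) response to a synthetic 11-year, solar-cycle-like forcing; this is a generic sketch of how a time constant damps and delays a periodic input, not the authors' fit procedure:

```python
import math

def exp_response(forcing, dt, tau):
    """Discretized first-order response y' = (f - y) / tau,
    applied sample by sample to a forcing series."""
    y = 0.0
    out = []
    alpha = dt / tau
    for f in forcing:
        y += alpha * (f - y)
        out.append(y)
    return out

dt = 0.1  # years
t = [i * dt for i in range(1000)]  # 100 years
forcing = [math.sin(2 * math.pi * ti / 11.0) for ti in t]  # 11-year cycle
fast = exp_response(forcing, dt, tau=1.0)   # short response time
slow = exp_response(forcing, dt, tau=10.0)  # long response time
# A longer time constant damps the amplitude and delays the peak:
print(max(slow) < max(fast) <= max(forcing))  # → True
```

This is why, as the passage argues, the response to decadal-scale forcing constrains neither the century-scale equilibrium sensitivity nor the effective ocean heat capacity at longer periods.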


This paper uses an aggregate quantity space to decompose the temporal changes in nitrogen use efficiency and cumulative exergy use efficiency into changes of Moorsteen–Bjurek (MB) Total Factor Productivity (TFP) and changes in the aggregate nitrogen and cumulative exergy contents. Changes in productivity can be decomposed into technical change and changes in various efficiency measures such as technical efficiency, scale efficiency and residual mix efficiency. Changes in the aggregate nitrogen and cumulative exergy contents can be driven by changes in the quality of inputs and outputs and changes in the mixes of inputs and outputs. In cumulative exergy content analysis, moreover, changes in the efficiency of input production can increase or decrease the cumulative exergy transformity of agricultural production. The empirical study of 30 member countries of the Organisation for Economic Co-operation and Development from 1990 to 2003 yielded some important findings. The production technology progressed, but there were reductions in technical efficiency, scale efficiency and residual mix efficiency levels. This result suggests that the production frontier had shifted up but that there were lags in the responses of member countries to the technological change. Given TFP growth, improvements in nutrient use efficiency and cumulative exergy use efficiency were counteracted by reductions in the changes of the aggregate nitrogen contents ratio and the aggregate cumulative exergy contents ratio. The empirical results also confirmed that different combinations of inputs and outputs, as well as the quality of inputs and outputs, could have more influence on the growth of nutrient and cumulative exergy use efficiency than the factors that had driven productivity change.
Keywords: Nutrient use efficiency; Cumulative exergy use efficiency; Thermodynamic efficiency change; Productivity growth; OECD agriculture; Sustainability


Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP), conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
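
The conditioning step can be sketched with a minimal scalar Kalman filter. The full approach described above uses Kalman smoothing and Gaussian-process innovation terms over parameters and inputs; this sketch shows only the skeleton of conditioning a linear state-space prior on data:

```python
def kalman_filter(obs, a, q, r, x0=0.0, p0=1.0):
    """Filter observations y_t = x_t + noise, with prior dynamics
    x_t = a * x_{t-1} + w_t.
    a: state transition, q: process variance, r: observation variance.
    Returns the filtered state means."""
    x, p = x0, p0
    means = []
    for y in obs:
        # predict step (propagate the linear prior)
        x, p = a * x, a * a * p + q
        # update step (condition on the observation)
        k = p / (p + r)        # Kalman gain
        x = x + k * (y - x)
        p = (1 - k) * p
        means.append(x)
    return means

# With near-noiseless observations the filter locks onto the data,
# as an emulator must at the design points.
print(kalman_filter([1.0, 1.0, 1.0], a=1.0, q=0.1, r=1e-9))
```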


This study estimates the environmental efficiency of international listed firms in 10 worldwide sectors from 2007 to 2013 by applying an order-m method, a non-parametric approach based on the free disposal hull with subsampling bootstrapping. Using a conventional output of gross profit and two conventional inputs of labor and capital, this study examines the order-m environmental efficiency accounting for the presence of each of 10 undesirable inputs/outputs and measures the shadow prices of each undesirable input and output. The results show that there is greater potential for the reduction of undesirable inputs than of undesirable outputs. On average, total energy, electricity, or water usage has the potential to be reduced by 50%. The median shadow prices of undesirable inputs, however, are much higher than the surveyed representative market prices. Approximately 10% of the firms in the sample appear to be potential sellers or production reducers in terms of undesirable inputs/outputs, which implies that the price of each item at the current level has little impact on most of the firms. Moreover, this study shows that the environmental, social, and governance activities of a firm do not considerably affect environmental efficiency.
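
An order-m input efficiency estimate in the spirit of this method can be sketched as follows, for a single input and output; the study's multi-input treatment of undesirable inputs/outputs is more involved, and the subsample size, draw count, and data here are illustrative:

```python
import random

def order_m_efficiency(x0, y0, sample, m=20, draws=200, seed=1):
    """Average, over `draws` subsamples of m firms each producing at
    least y0, of the best (smallest) input ratio x / x0 among the draw.
    Values < 1 mean the evaluated firm could reduce its input; unlike
    full FDH, values > 1 (super-efficiency) are possible because the
    firm itself is not forced into each subsample."""
    rng = random.Random(seed)
    peers = [(x, y) for x, y in sample if y >= y0]
    total = 0.0
    for _ in range(draws):
        sub = [rng.choice(peers) for _ in range(m)]
        total += min(x / x0 for x, y in sub)
    return total / draws

# A firm using 2 units of input where comparable peers use 1:
sample = [(1.0, 1.0)] * 5
print(order_m_efficiency(2.0, 1.0, sample))  # → 0.5
```

Averaging over random subsamples, rather than enveloping all observations, is what makes order-m less sensitive to outliers than plain FDH or DEA.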


This paper presents preliminary findings on measuring technical efficiency using Data Envelopment Analysis (DEA) in Malaysian Real Estate Investment Trusts (REITs), to determine best practice for operations, including asset allocation and scale size, and thereby improve the performance of Malaysian REITs. Variables identified as inputs and outputs are assessed in this cross-sectional analysis using the operational approach and Variable Returns to Scale DEA (VRS-DEA), focusing on Malaysian REITs for the year 2013. Islamic REITs have higher efficiency scores than conventional REITs for both models. Diversified REITs are more efficient than specialised REITs under both models. For Model 1, the greater inefficiency is identified in managerial inefficiency rather than scale inefficiency, which shows that inputs are not fully minimised to produce more outputs. However, when other expenses are considered as separate input variables, the efficiency score rises from 60.3% to 81.2%. In Model 2, scale inefficiency produces greater inefficiency than managerial inefficiency. The result suggests that Malaysian REITs have been operating at the wrong scale of operations, as the majority of Malaysian REITs are operating at decreasing returns to scale.
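
The input-oriented VRS (BCC) model underlying such efficiency scores can be sketched as a linear program; this is a generic textbook formulation, the data below are made up, and the study's operational-approach variable choices are not reproduced:

```python
import numpy as np
from scipy.optimize import linprog

def vrs_input_efficiency(j0, X, Y):
    """Input-oriented BCC efficiency of firm j0.
    X: inputs (n_firms x n_inputs), Y: outputs (n_firms x n_outputs).
    Returns theta in (0, 1]; theta < 1 means all inputs could be
    scaled down by theta while still producing firm j0's outputs."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                    # minimise theta
    A_ub = np.vstack([
        np.c_[-X[j0], X.T],                        # sum(lam * x) <= theta * x0
        np.c_[np.zeros(Y.shape[1]), -Y.T],         # sum(lam * y) >= y0
    ])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[j0]]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS: sum(lam) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Firm B uses 4 input units for the same output peer A produces with 2:
X, Y = [[2.0], [4.0]], [[1.0], [1.0]]
print(vrs_input_efficiency(1, X, Y))  # → 0.5
```

Solving the same data under constant returns to scale and comparing the scores is the usual way to separate managerial (pure technical) inefficiency from scale inefficiency.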


A spring-mass-lever (SML) model is introduced in this paper for a single-input-single-output compliant mechanism to capture its static and dynamic behavior. The SML model is a reduced-order model, and its five parameters provide physical insight and quantify the stiffness and inertia at the input and output ports as well as the transformation of force and displacement between the input and output. The model parameters can be determined with reasonable accuracy without performing dynamic or modal analysis. The paper describes two uses of the SML model: computationally efficient analysis of a system of which the compliant mechanism is a part, and design of compliant mechanisms for given user specifications. During design, the SML model enables determining the feasible parameter space of user-specified requirements, assessing the suitability of a compliant mechanism to meet the user specifications, and selecting and/or re-designing compliant mechanisms from an existing database. Manufacturing constraints, material choice, and other practical considerations are incorporated into this methodology. A micromachined accelerometer and a valve mechanism are used as examples to show the effectiveness of the SML model in analysis and design. (C) 2012 Published by Elsevier Ltd.