906 results for Returns to scale
Abstract:
The rapid increase in the size and number of databases demands data mining approaches that scale to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
Abstract:
Generally, classifiers tend to overfit if there is noise in the training data or there are missing values. Ensemble learning methods are often used to improve a classifier's classification accuracy. Most ensemble learning approaches aim to improve the classification accuracy of decision trees. However, alternative classifiers to decision trees exist. The recently developed Random Prism ensemble learner for classification aims to improve an alternative classification rule induction approach, the Prism family of algorithms, which addresses some of the limitations of decision trees. However, Random Prism, like any ensemble learner, suffers from a high computational overhead due to replication of the data and the induction of multiple base classifiers. Hence even modest-sized datasets may pose a computational challenge to ensemble learners such as Random Prism. Parallelism is often used to scale up algorithms to deal with large datasets. This paper investigates parallelisation for Random Prism, implements a prototype and evaluates it empirically using a Hadoop computing cluster.
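As a concrete illustration of the parallel ensemble idea, the sketch below trains a bagged set of very simple rule learners on bootstrap samples using a local process pool and combines them by majority vote. It is a minimal sketch under simplifying assumptions, not the paper's implementation: the real Random Prism induces full Prism rule sets and the parallel prototype distributes that work over a Hadoop cluster, and every function and variable name here is invented for the example.

```python
# Minimal sketch of bagging-style parallel rule induction (illustrative only).
# Each worker induces one toy rule ("attribute == value => class") from a
# bootstrap sample; predictions are combined by majority vote. The actual
# system induces full Prism rule sets and runs on Hadoop/MapReduce.
import random
from collections import Counter
from multiprocessing import Pool

def induce_one_rule(args):
    data, labels, seed = args
    rng = random.Random(seed)
    idx = [rng.randrange(len(data)) for _ in range(len(data))]   # bootstrap sample
    sample = [(data[i], labels[i]) for i in idx]
    default = Counter(label for _, label in sample).most_common(1)[0][0]
    best = None
    for attr in range(len(data[0])):
        for value in {row[attr] for row, _ in sample}:
            covered = [label for row, label in sample if row[attr] == value]
            cls, hits = Counter(covered).most_common(1)[0]
            accuracy = hits / len(covered)
            if best is None or accuracy > best[0]:
                best = (accuracy, attr, value, cls)
    _, attr, value, cls = best
    return (attr, value, cls, default)          # one tiny base classifier

def predict(models, x):
    votes = Counter(cls if x[attr] == value else default
                    for attr, value, cls, default in models)
    return votes.most_common(1)[0][0]           # majority vote of the ensemble

if __name__ == "__main__":
    data = [[0, "a"], [0, "b"], [1, "a"], [1, "b"]] * 25
    labels = [0, 0, 1, 1] * 25
    with Pool(processes=4) as pool:             # induce base classifiers concurrently
        models = pool.map(induce_one_rule, [(data, labels, s) for s in range(11)])
    print(predict(models, [0, "b"]))            # classify a new instance
```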
Abstract:
When villagers extract resources, such as fuelwood, fodder, or medicinal plants from forests, their decisions over where and how much to extract are influenced by market conditions, their particular opportunity costs of time, minimum consumption needs, and access to markets. This paper develops an optimization model of villagers’ extraction behavior that clarifies how, and under what conditions, policies that create incentives such as improved returns to extraction in a buffer zone might be used instead of adversarial enforcement efforts to protect a forest’s pristine “inner core.”
Abstract:
Simulations from eleven coupled chemistry-climate models (CCMs) employing nearly identical forcings have been used to project the evolution of stratospheric ozone throughout the 21st century. The model-to-model agreement in projected temperature trends is good, and all CCMs predict continued, global mean cooling of the stratosphere over the next 5 decades, increasing from around 0.25 K/decade at 50 hPa to around 1 K/decade at 1 hPa under the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A1B scenario. In general, the simulated ozone evolution is mainly determined by decreases in halogen concentrations and continued cooling of the global stratosphere due to increases in greenhouse gases (GHGs). Column ozone is projected to increase as stratospheric halogen concentrations return to 1980s levels. Because of ozone increases in the middle and upper stratosphere due to GHG-induced cooling, total ozone averaged over midlatitudes, outside the polar regions, and globally, is projected to increase to 1980 values between 2035 and 2050 and before lower stratospheric halogen amounts decrease to 1980 values. In the polar regions the CCMs simulate small temperature trends in the first and second half of the 21st century in midwinter. Differences in stratospheric inorganic chlorine (Cly) among the CCMs are key to diagnosing the intermodel differences in simulated ozone recovery, in particular in the Antarctic. It is found that there are substantial quantitative differences in the simulated Cly, with the October mean Antarctic Cly peak value varying from less than 2 ppb to over 3.5 ppb in the CCMs, and the date at which the Cly returns to 1980 values varying from before 2030 to after 2050. There is a similar variation in the timing of recovery of Antarctic springtime column ozone back to 1980 values. As most models underestimate peak Cly near 2000, ozone recovery in the Antarctic could occur even later, between 2060 and 2070. In the Arctic the column ozone increase in spring does not follow halogen decreases as closely as in the Antarctic, reaching 1980 values before Arctic halogen amounts decrease to 1980 values and before the Antarctic. None of the CCMs predict future large decreases in the Arctic column ozone. By 2100, total column ozone is projected to be substantially above 1980 values in all regions except in the tropics.
Abstract:
The evolution of stratospheric ozone from 1960 to 2100 is examined in simulations from 14 chemistry-climate models, driven by prescribed levels of halogens and greenhouse gases. There is general agreement among the models that total column ozone reached a minimum around year 2000 at all latitudes, projected to be followed by an increase over the first half of the 21st century. In the second half of the 21st century, ozone is projected to continue increasing, level off, or even decrease depending on the latitude. Separation into partial columns above and below 20 hPa reveals that these latitudinal differences are almost completely caused by differences in the model projections of ozone in the lower stratosphere. At all latitudes, upper stratospheric ozone increases throughout the 21st century and is projected to return to 1960 levels well before the end of the century, although there is a spread among models in the dates that ozone returns to specific historical values. We find that decreasing halogens and declining upper atmospheric temperatures, driven by increasing greenhouse gases, contribute almost equally to increases in upper stratospheric ozone. In the tropical lower stratosphere, an increase in upwelling causes a steady decrease in ozone through the 21st century, and total column ozone does not return to 1960 levels in most of the models. In contrast, lower stratospheric and total column ozone in middle and high latitudes increases during the 21st century, returning to 1960 levels well before the end of the century in most models.
Abstract:
We evaluate the effects of spatial resolution on the ability of a regional climate model to reproduce observed extreme precipitation for a region in the Southwestern United States. A total of 73 National Climatic Data Center observational sites spread throughout Arizona and New Mexico are compared with regional climate simulations at spatial resolutions of 50 km and 10 km for a 31-year period from 1980 to 2010. We analyze mean, 3-hourly and 24-hourly extreme precipitation events using WRF regional model simulations driven by NCEP-2 reanalysis. The mean climatological spatial structure of precipitation in the Southwest is well represented in the 10 km simulation but missing in the coarse (50 km) simulation. However, the fine grid has a larger positive bias in mean summer precipitation than the coarse-resolution grid. The large overestimation in the simulation is in part due to scale-dependent deficiencies in the Kain-Fritsch convective parameterization scheme that generate excessive precipitation and induce a slow eastward propagation of the moist convective summer systems in the high-resolution simulation. Despite this overestimation in the mean, the 10 km simulation captures individual extreme summer precipitation events better than the 50 km simulation. In winter, however, the two simulations appear to perform equally well in simulating extremes.
Abstract:
This paper seeks to synthesise the various contributions to the special issue of Long Range Planning on competence-creating subsidiaries (CCS), and identifies avenues for future research. Effective competence-creation through a network of subsidiaries requires an appropriate balance between internal and external embeddedness. There are multiple types of firm-specific advantages (FSAs) essential to achieve this. In addition, wide-bandwidth pathways are needed with collaborators, suppliers, customers as well as internally within the MNE. Paradoxically, there is a natural tendency for bandwidth to shrink as dispersion increases. As distances (technological, organisational, and physical) become greater, there may be decreasing returns to R&D spread. Greater resources for knowledge integration and coordination are needed as intra-MNE and inter-firm R&D cooperation becomes more intensive and extensive. MNEs need to invest in mechanisms to promote wide-bandwidth knowledge flows, without which widely dispersed and networked MNEs can suffer from internal market failures.
Abstract:
p-(Dimethylamino)phenyl pentazole, DMAP-N5 (DMAP = Me2N−C6H4), was characterized by picosecond transient infrared spectroscopy and infrared spectroelectrochemistry. Femtosecond laser excitation at 310 or 330 nm produces the DMAP-N5 (S1) excited state, part of which returns to the ground state (τ = 82 ± 4 ps), while DMAP-N and DMAP-N3 (S0) are generated as double and single N2-loss photoproducts with η ≈ 0.14. The lifetime of DMAP-N5 (S1) is temperature and solvent dependent. [DMAP-N3]+ is produced from DMAP-N5 in a quasireversible, one-electron oxidation process (E1/2 = +0.67 V). Control experiments with DMAP-N3 support the findings. DFT B3LYP/6-311G** calculations were used to identify DMAP-N5 (S1), [DMAP-N3]+, and DMAP-N in the infrared spectra. Both DMAP-N5 (S1) and [DMAP-N5]+ have a weakened N5 ring structure.
Abstract:
This paper uses a recently developed nonlinear Granger causality test to determine whether linear orthogonalization really does remove general stock market influences on real estate returns to leave pure industry effects in the latter. The results suggest that there is no nonlinear relationship between the US equity-based property index returns and returns on a general stock market index, although there is evidence of nonlinear causality for the corresponding UK series.
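As a rough illustration of the orthogonalization step that such tests are applied to, the sketch below regresses synthetic property index returns on general stock market returns and keeps the residuals as the "pure industry" component, then runs a standard linear Granger causality test from statsmodels as a baseline. The nonlinear test used in the paper is not reproduced here, and the toy series and variable names are assumptions made for the example.

```python
# Sketch: linear orthogonalization of property returns against market returns,
# followed by a *linear* Granger causality test as a baseline. The paper itself
# applies a nonlinear causality test, which is not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
market = pd.Series(rng.normal(size=n), name="market")         # stand-in market index returns
property_ret = 0.6 * market + rng.normal(scale=0.5, size=n)   # stand-in property index returns

# Orthogonalization: regress property returns on market returns and keep the
# residuals as the "pure industry" component.
X = sm.add_constant(market)
residuals = sm.OLS(property_ret, X).fit().resid

# Baseline linear test: does the market series Granger-cause the residuals?
pair = pd.DataFrame({"residuals": residuals, "market": market})
grangercausalitytests(pair[["residuals", "market"]], maxlag=4)
```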
Abstract:
The early twentieth century constituted the heyday of the ‘breadwinner–homemaker’ household, characterized by a high degree of intra-household functional specialization between paid and domestic work according to age, gender, and marital status. This article examines the links between formal workforce participation and access to resources for individualized discretionary spending in British working-class households during the late 1930s, via an analysis of household leisure expenditures. Leisure spending is particularly salient to intra-household resource allocation, as it constitutes one of the most highly prioritized areas of individualized expenditure, especially for young, single people. Using a database compiled from surviving returns to the Ministry of Labour's national 1937/8 working-class expenditure survey, we examine leisure participation rates for over 600 households, using a detailed set of commercial leisure activities together with other relevant variables. We find that the employment status of family members other than the male breadwinner was a key factor influencing their access to commercial leisure. Our analysis thus supports the view that the breadwinner–homemaker household was characterized by strong power imbalances that concentrated resources—especially for individualized expenditures—in the hands of those family members who engaged in paid labour.
Abstract:
In this essay Alison Donnell returns to the material object of Edward Baugh's essay, published in the pages of the Trinidadian little magazine Tapia in 1977, in order to re-read the force of its arguments in the context of its own politicocultural history and to assess the significance of its publication venue. Donnell attends to Baugh's own standing in the highly charged field of Caribbean literary criticism as a critic of both Walcott and Naipaul, and acknowledges his creative contribution to this field as a poet. She also considers how, in the years between the original publication of Baugh's article and its republication, the questions of historical invisibility have entered newly disputed territories that demand attention to how gender, indigeneity, spirituality, and sexuality shape ideas of historical and literary legitimacy, in addition to those foundational questions around a politics of race and class.
Abstract:
This study presents an evaluation of the size and strength of convective updraughts in high-resolution simulations by the UK Met Office Unified Model (UM). Updraught velocities have been estimated from range–height indicator (RHI) Doppler velocity measurements using the Chilbolton advanced meteorological radar, as part of the Dynamical and Microphysical Evolution of Convective Storms (DYMECS) project. These estimates are based on mass continuity and the vertical integration of the observed radial convergence; because the cross-radial convergence goes undetected, vertical velocities in convective clouds tend to be underestimated. Velocity fields from the UM at a resolution corresponding to the radar observations are used to scale such estimates and mitigate the inherent biases. The analysis of more than 100 observed and simulated storms indicates that the horizontal scale of updraughts in the simulations tends to decrease with grid length; the 200 m grid length agreed most closely with the observations. Typical updraught mass fluxes in the 500 m grid length simulations were up to an order of magnitude greater than observed, and greater still in the 1.5 km grid length simulations. The effect of increasing the mixing length in the sub-grid turbulence scheme depends on the grid length. For the 1.5 km simulations, updraughts were weakened though their horizontal scale remained largely unchanged. For the sub-kilometre grid lengths, and progressively more so as the grid length decreased, updraughts were broadened and intensified; their horizontal scale was then determined by the mixing length rather than the grid length. In general, simulated updraughts were found to weaken too quickly with height. The findings were supported by the analysis of the widths of reflectivity patterns in both the simulations and observations.
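For reference, the mass-continuity relation behind such single-Doppler retrievals can be written as below. This is a simplified incompressible form with illustrative notation rather than the paper's own equations; since the radar observes only the radial wind, only part of the horizontal convergence enters the integral, which is why the retrieved vertical velocities are biased low.

```latex
% Simplified (incompressible) mass-continuity retrieval: integrating the
% horizontal convergence upward from a reference height z_0 gives w(z).
\[
\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} + \frac{\partial w}{\partial z} = 0
\quad\Longrightarrow\quad
w(z) = w(z_0) - \int_{z_0}^{z} \left( \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} \right) \mathrm{d}z'.
\]
% In practice only the radial component of the horizontal convergence is
% observed, so the cross-radial part of the divergence is missing.
```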
Abstract:
Research evaluating perceptual responses to music has identified many structural features as correlates that might be incorporated in computer music systems for affectively charged algorithmic composition and/or expressive music performance. In order to investigate the possible integration of isolated musical features into such a system, a discrete feature known to correlate to some extent with emotional responses – rhythmic density – was selected from a literature review and incorporated into a prototype system. This system produces variation in rhythmic density via a transformative process. A stimulus set created using this system was then subjected to a perceptual evaluation. Pairwise comparisons were used to scale differences between 48 stimuli. Listener responses were analysed with multidimensional scaling (MDS). The 2-dimensional solution was then rotated to place the stimuli with the largest range of variation across the horizontal plane. Stimuli with variation in rhythmic density were placed further from the source material than stimuli that were generated by random permutation. This, combined with the striking similarity between the MDS configuration and the 2-dimensional emotional model used by some affective algorithmic composition systems, suggests that isolated musical feature manipulation can now be used to parametrically control affectively charged automated composition in a larger system.
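As an illustration of the analysis stage, the sketch below embeds a set of 48 stimuli in two dimensions from a pairwise dissimilarity matrix using metric MDS and then rotates the solution so that the axis of largest variation lies horizontally. The random dissimilarities stand in for aggregated pairwise-comparison judgements, and the use of scikit-learn (with PCA for the rotation) is an assumption for the sketch, not a description of the study's pipeline.

```python
# Sketch: 2-D MDS from a pairwise dissimilarity matrix, rotated (via PCA) so
# that the largest range of variation lies along the horizontal axis.
# The random dissimilarities are placeholders for aggregated listener judgements.
import numpy as np
from sklearn.manifold import MDS
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
n_stimuli = 48
raw = rng.random((n_stimuli, n_stimuli))
dissimilarity = (raw + raw.T) / 2          # symmetrize
np.fill_diagonal(dissimilarity, 0.0)       # zero self-dissimilarity

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)

coords_rotated = PCA(n_components=2).fit_transform(coords)  # horizontal = largest variance
print(coords_rotated.shape)                # (48, 2)
```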
Abstract:
This article examines corporate governance in one of Fiji’s largest trust organisations, the Native Land Trust Board. The principal-agent framework is utilised to analyse the governance issue in this study. An examination of the annual reports and final accounts over the last three decades indicates that poor governance practices by the agent have resulted in the Board not delivering maximum returns to its principal, the landowners.
Abstract:
Tropical vegetation is a major source of global land surface evapotranspiration, and can thus play a major role in global hydrological cycles and global atmospheric circulation. Accurate prediction of tropical evapotranspiration is critical to our understanding of these processes under changing climate. We examined the controls on evapotranspiration in tropical vegetation at 21 pan-tropical eddy covariance sites, conducted a comprehensive and systematic evaluation of 13 evapotranspiration models at these sites, and assessed the ability to scale up model estimates of evapotranspiration for the test region of Amazonia. Net radiation was the strongest determinant of evapotranspiration (mean evaporative fraction was 0.72) and explained 87% of the variance in monthly evapotranspiration across the sites. Vapor pressure deficit was the strongest residual predictor (14%), followed by normalized difference vegetation index (9%), precipitation (6%) and wind speed (4%). The radiation-based evapotranspiration models performed best overall for three reasons: (1) the vegetation was largely decoupled from atmospheric turbulent transfer (calculated from the Ω decoupling factor), especially at the wetter sites; (2) the resistance-based models were hindered by difficulty in consistently characterizing canopy (and stomatal) resistance in the highly diverse vegetation; (3) the temperature-based models inadequately captured the variability in tropical evapotranspiration. We evaluated the potential to predict regional evapotranspiration for one test region: Amazonia. We estimated an Amazonia-wide evapotranspiration of 1370 mm yr⁻¹, but this value is dependent on assumptions about energy balance closure for the tropical eddy covariance sites; a lower value (1096 mm yr⁻¹) is considered in the discussion on the use of flux data to validate and interpolate models.
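To make the radiation-based result concrete, the small example below converts a daily-mean net radiation into evapotranspiration using a fixed evaporative fraction, the simplest calculation consistent with the reported mean evaporative fraction of 0.72. The constant-EF assumption, the 130 W m⁻² net radiation value and the function name are illustrative choices for this sketch, not the paper's models.

```python
# Worked example: crude radiation-based ET estimate with a fixed evaporative
# fraction (EF): latent heat flux = EF * net radiation, converted to mm/day.
# EF = 0.72 is the reported cross-site mean; the net radiation value is illustrative.
LAMBDA = 2.45e6           # latent heat of vaporization, J kg^-1 (approx., near 20 degC)
SECONDS_PER_DAY = 86400

def et_mm_per_day(net_radiation_w_m2, evaporative_fraction=0.72):
    """Convert daily-mean net radiation (W m^-2) to evapotranspiration (mm day^-1)."""
    latent_heat_flux = evaporative_fraction * net_radiation_w_m2   # W m^-2
    return latent_heat_flux * SECONDS_PER_DAY / LAMBDA             # kg m^-2 day^-1 == mm day^-1

print(round(et_mm_per_day(130.0), 2))   # about 3.3 mm/day, roughly 1200 mm/yr
```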