975 results for profitability calculation
Abstract:
The p-type carrier scattering rate due to alloy disorder in Si1-xGex alloys is obtained from first principles. The required alloy scattering matrix elements are calculated from the energy splitting of the valence bands, which arises when one average host atom is replaced by a Ge or Si atom in supercells containing up to 128 atoms. Alloy scattering within the valence bands is found to be characterized by a single scattering parameter. The hole mobility is calculated from the scattering rate using the Boltzmann transport equation in the relaxation time approximation. The results are in good agreement with experiments on bulk, unstrained alloys.
Abstract:
First-principles electronic structure methods are used to find the rates of intravalley and intervalley n-type carrier scattering due to alloy disorder in Si1-xGex alloys. The required alloy scattering matrix elements are calculated from the energy splitting of nearly degenerate Bloch states which arises when one average host atom is replaced by a Ge or Si atom in supercells containing up to 128 atoms. Scattering parameters for all relevant Delta and L intravalley and intervalley alloy scattering are calculated. Atomic relaxation is found to have a substantial effect on the scattering parameters. f-type intervalley scattering between Delta valleys is found to be comparable to other scattering channels. The n-type carrier mobility, calculated from the scattering rate using the Boltzmann transport equation in the relaxation time approximation, is in excellent agreement with experiments on bulk, unstrained alloys.
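Both SiGe abstracts reduce the final step to a relaxation-time-approximation mobility. The sketch below is a toy numerical version of that step only, assuming a parabolic band, Boltzmann statistics, and an invented alloy-scattering prefactor; none of the constants come from the papers.

```python
import math

# Toy relaxation-time-approximation mobility with an x(1-x) alloy scattering
# rate, 1/tau(E) = C * x * (1 - x) * sqrt(E) for a parabolic band.
# C and m_eff are illustrative placeholders, not first-principles values.

E_CHARGE = 1.602e-19  # C
M0 = 9.109e-31        # kg

def alloy_scattering_rate(energy_ev, x, c_prefactor=1e14):
    """Scattering rate 1/tau in 1/s; c_prefactor is an assumed constant."""
    return c_prefactor * x * (1.0 - x) * math.sqrt(max(energy_ev, 0.0))

def mobility(x, m_eff=0.3, kT_ev=0.0259, n_grid=2000):
    """mu = e<tau>/m*, with the Boltzmann-weighted average
    <tau> = int tau E^{3/2} exp(-E/kT) dE / int E^{3/2} exp(-E/kT) dE."""
    num = den = 0.0
    dE = 10.0 * kT_ev / n_grid
    for i in range(1, n_grid):
        E = i * dE
        w = E ** 1.5 * math.exp(-E / kT_ev)
        num += w / alloy_scattering_rate(E, x) * dE
        den += w * dE
    tau_avg = num / den
    return E_CHARGE * tau_avg / (m_eff * M0) * 1e4  # cm^2/(V s)
```

The x(1-x) factor makes scattering strongest near x = 0.5, so the sketch reproduces the characteristic alloy-limited mobility minimum at mid-composition.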
Abstract:
Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation, and offshore energy have historically been post hoc; i.e. the time and place of human activity is often already determined before assessment of environmental impacts. In this dissertation, I build robust species distribution models in two case study areas, US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting for offshore wind energy development, and routing ships to minimize risk of striking whales. Both decision frameworks relate the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.
For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species, weighted by OWED sensitivity to collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.
Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but is a spatially different problem. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, and then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e. ports and entrance points to the study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to cetacean conservation versus cost to the transportation industry, measured as distance. Similar to the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
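The least-cost routing step can be sketched as Dijkstra's algorithm over a grid whose edge cost mixes distance with a multiplier times a conservation-risk surface. The grid, weights, and unit step cost below are illustrative, not the chapter's actual cost surface.

```python
import heapq

# Least-cost path across a 4-connected grid; edge cost = 1 (distance)
# plus multiplier * risk of the cell being entered. The risk grid stands
# in for the cetacean density/conservation-status cost surface.

def least_cost_route(risk, start, end, multiplier):
    rows, cols = len(risk), len(risk[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == end:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 1.0 + multiplier * risk[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessors back from the end point to recover the route.
    path, node = [], end
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[end]
```

Raising the multiplier trades extra distance for avoided risk, which is exactly the family of routes plotted on the tradeoff curve.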
Essential inputs to these decision frameworks are the species distributions. The two preceding chapters comprise species distribution models for the two case study areas, U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the U.S., the necessary parameters, especially distance and angle of observation, are less readily available across publicly mined datasets.
In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database, and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing for easy navigation of models by taxon, region, season, and data provider.
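The threshold-selection step can be sketched directly: scan candidate cutoffs on predicted occurrence probabilities and keep the one minimizing the sum of false-positive and false-negative rates (equivalently, maximizing Youden's J). The scores and labels in the test are invented, not survey outputs.

```python
# Pick the presence/absence cutoff that minimizes FPR + FNR over the
# observed score set, as done when thresholding a ROC curve.

def optimal_threshold(scores, labels):
    positives = sum(labels)
    negatives = len(labels) - positives
    best_t, best_err = None, float("inf")
    for t in sorted(set(scores)):
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        err = fp / negatives + fn / positives  # FPR + FNR
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err
```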
For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven useful in cases where fewer observations are available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, reporting for spring and autumn seasons (rather than summer alone), and providing new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill risk and ocean noise associated with increases in container ship and oil tanker traffic in British Columbia’s continental shelf waters.
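The CDS step can be sketched with a half-normal detection function: fit its scale to the perpendicular sighting distances, convert that to an effective strip half-width, and scale counts by the surveyed area. The distances and effort below are made-up numbers, not the Raincoast survey data.

```python
import math

# Conventional distance sampling for line transects with a half-normal
# detection function g(x) = exp(-x^2 / (2 sigma^2)).
#   sigma MLE: sigma^2 = sum(x^2) / n
#   effective strip half-width: esw = sigma * sqrt(pi / 2)
#   density: D = n / (2 * L * esw)

def cds_density(perp_distances_km, transect_length_km):
    n = len(perp_distances_km)
    sigma = math.sqrt(sum(x * x for x in perp_distances_km) / n)
    esw = sigma * math.sqrt(math.pi / 2.0)
    return n / (2.0 * transect_length_km * esw)  # animals per km^2
```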
Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance, which can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework, enabling interchangeable map and tradeoff plot views. These products make complex processes transparent, allowing conservation interests, industry, and stakeholders to game scenarios toward optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management, and dynamic ocean management.
Abstract:
This research examined the factors contributing to the performance of online grocers prior to, and following, the 2000 dot-com collapse. The primary goals were to assess the relationship between a company’s business model(s) and its performance in the online grocery channel and to determine if there were other company and/or market related factors that could account for company performance. To assess the primary goals, a case-based theory building process was utilized. A three-way cross-case analysis comprising Peapod, GroceryWorks, and Tesco examined the common profit components, the structural category (e.g., pure-play, partnership, and hybrid) profit components, and the idiosyncratic profit components related to each specific company. Based on the analysis, it was determined that online grocery store business models could be represented at three distinct but hierarchically related levels. The first level was termed the core model and represented the basic profit structure that all online grocers needed in order to conduct operations. The next model level was termed the structural model and represented the profit structure associated with the specific business model configuration (i.e., pure-play, partnership, hybrid). The last model level was termed the augmented model and represented the company’s business model when idiosyncratic profit components were included. In relation to the five company related factors, scalability, rate of expansion, and the automation level were potential candidates for helping to explain online grocer performance. In addition, all the market structure related factors were deemed possible candidates for helping to explain online grocer performance. The study concluded by positing an alternative hypothesis concerning the performance of online grocers. Prior to this study, the prevailing wisdom was that the business models were the primary cause of online grocer performance.
However, based on the core model analysis, it was hypothesized that the customer relationship activities (i.e., advertising, promotions, and loyalty program tie-ins) were the real drivers of online grocer performance.
Abstract:
Bulk delta15N values in surface sediment samples off the southwestern coast of Africa were measured to investigate the biogeochemical processes occurring in the water column. Nitrate concentrations and the degree of utilization of the nitrate pool are the predominant controls on sedimentary delta15N in the Benguela Current region. Denitrification does not appear to have had an important effect on the delta15N signal of these sediments and, based on delta15N and delta13C, there is little terrestrial input.
Abstract:
This article presents the construction of the Social Profitability Index in Communication (IRSCOM), which aims to capture values linked to the operation of the media, rejecting a mercantilist vision and enhancing citizen participation and transparency in media management. The indicator is a proposal that seeks to correct deficiencies in the social profitability of the media, in order to consolidate models that respond to a logic focused on building democracy and strengthening plurality and diversity.
Abstract:
A regional offset (ΔR) from the marine radiocarbon calibration curve is widely used in calibration software (e.g. CALIB, OxCal) but often is not calculated correctly. While the calculation is relatively straightforward for known-age samples, such as mollusks from museum collections or banded corals, it is more difficult to calculate ΔR and its uncertainty for 14C dates on paired marine and terrestrial samples. Previous researchers have often used classical intercept methods (Reimer et al. 2002; Dewar et al. 2012; Russell et al. 2011), but these do not account for the full calibrated probability density function (PDF). We have developed an online application for performing these calculations for known-age samples, paired marine and terrestrial 14C dates, or U-Th dated corals, which is available at http://calib.qub.ac.uk/deltar
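The paired-sample idea can be sketched by Monte Carlo: propagate the terrestrial date's full uncertainty rather than a single intercept. This is a minimal illustration only; the `cal_curve` and `marine_curve` arguments are toy stand-ins for the real calibration curves, and actual calculations should use the application above.

```python
import random
import statistics

# Monte Carlo sketch of Delta-R for a paired marine/terrestrial sample:
# sample a terrestrial 14C age, invert the (toy) terrestrial curve to a
# calendar age, read the modelled marine 14C age there, and difference it
# against a sampled measured marine age.

def delta_r_distribution(terr_age, terr_err, marine_age, marine_err,
                         cal_curve, marine_curve, n=20000, seed=1):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        t14 = rng.gauss(terr_age, terr_err)
        cal = cal_curve(t14)            # toy inverse terrestrial curve
        model_marine = marine_curve(cal)  # modelled marine 14C age
        m14 = rng.gauss(marine_age, marine_err)
        samples.append(m14 - model_marine)
    return statistics.mean(samples), statistics.stdev(samples)
```

With identity-plus-offset toy curves, a 2000 ± 30 BP terrestrial date paired with a 2500 ± 30 BP marine date and a 400-year model reservoir age gives ΔR near 100 with an uncertainty near the quadrature sum of the errors.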
Abstract:
Multiphase oil-water-gas flows are common in industrial activities such as chemical processing and petroleum extraction, and their measurement presents several difficulties. Precisely determining the volume fraction of each element composing a multiphase flow is very important in chemical plants and the petroleum industry. This work presents a methodology to determine volume fractions in annular and stratified multiphase flow systems using neutrons and artificial intelligence, based on the transmission/scattering of fast neutrons from a 241Am-Be source and point-flux measurements that are influenced by variations in the volume fractions. The proposed geometries in the mathematical model were used to build a data set in which the thickness of each material was varied to span the volume fraction of each phase, providing 119 compositions used in simulations with MCNP-X, a computer code based on the Monte Carlo method that simulates radiation transport. An artificial neural network (ANN) was trained with the MCNP-X data and used to correlate the measurements with the respective true fractions. The ANN was able to correlate the simulated data with the volume fractions of the multiphase (oil-water-gas) flows in both the annular and the stratified pattern, with average relative errors (%) per phase of: annular (air = 3.85; water = 4.31; oil = 1.08); stratified (air = 3.10; water = 2.01; oil = 1.45). The method demonstrated good efficiency in determining each material composing the phases, demonstrating the feasibility of the technique.
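The regression step, from detector responses to volume fractions, can be illustrated without a neural network. As a minimal stand-in for the MCNP-X-trained ANN, the sketch below fits a linear least-squares map from synthetic responses to the three phase fractions; the response matrix in the test is invented for illustration.

```python
# Recover f = (gas, water, oil) fractions from detector responses r by
# solving the normal equations (A^T A) f = A^T r, where A maps fractions to
# responses. A linear map replaces the paper's ANN purely for illustration.

def solve_fractions(responses, A):
    n, m = len(A), 3
    M = [[sum(A[k][i] * A[k][j] for k in range(n)) for j in range(m)]
         for i in range(m)]
    b = [sum(A[k][i] * responses[k] for k in range(n)) for i in range(m)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            factor = M[r][col] / M[col][col]
            for c in range(col, m):
                M[r][c] -= factor * M[col][c]
            b[r] -= factor * b[col]
    f = [0.0] * m
    for r in range(m - 1, -1, -1):
        f[r] = (b[r] - sum(M[r][c] * f[c] for c in range(r + 1, m))) / M[r][r]
    return f
```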
Abstract:
With the objective of improving reactor physics calculations for 2D and 3D nuclear reactors via the diffusion equation, an adaptive automatic finite element remeshing method, based on elementary area (2D) or volume (3D) constraints, has been developed. The adaptive remeshing technique, guided by an a posteriori error estimator, makes use of two external mesh generator programs: Triangle and TetGen. The use of these free external finite element mesh generators and an adaptive remeshing technique based on current-field continuity shows that they are powerful tools for improving the neutron flux distribution calculation and, by consequence, the power solution of the reactor core, even though they have only a minor influence on the criticality coefficient in the calculated reactor core examples. Two numerical examples are presented: the 2D IAEA reactor core numerical benchmark and a 3D model of the Argonauta research reactor, built in Brazil.
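The refine loop driven by an a posteriori error estimator can be sketched in one dimension, with intervals standing in for the Triangle/TetGen meshes; the length-based estimator in the test is a placeholder, not the paper's estimator.

```python
# Toy adaptive refinement loop: estimate a per-element error, split every
# element whose estimate exceeds a tolerance, and repeat until no element
# needs refinement (or an iteration cap is hit).

def adapt(elements, error_estimate, tol, max_iters=20):
    """elements: list of (a, b) intervals; error_estimate: callable on (a, b)."""
    for _ in range(max_iters):
        refined, changed = [], False
        for a, b in elements:
            if error_estimate(a, b) > tol:
                mid = 0.5 * (a + b)
                refined += [(a, mid), (mid, b)]  # bisect the element
                changed = True
            else:
                refined.append((a, b))
        elements = refined
        if not changed:
            break
    return elements
```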
Abstract:
Sharp edges were first used for field ionisation mass spectrometry by Beckey. Although Cross and Robertson found that etched metal foils were more effective than razor blades for field ionisation, blades are very convenient for the determination of field ionisation mass spectra, as reported by Robertson and Viney. The electric field at the vertex of a sharp edge can be calculated by the method of conformal transformation. Here we give some equations for the field, deduced under the assumption that the edge surface can be approximated by a hyperbola. We also compare two hyperbolae with radii of curvature at the vertex of 500 Angstrom and 1000 Angstrom with the profile of a commercial carbon-steel razor blade.
Abstract:
In Queensland the subtropical strawberry (Fragaria ×ananassa) breeding program aims to combine traits into new genotypes that increase production efficiency. The contribution of individual plant traits to cost and income under subtropical Queensland conditions was investigated. The study adapted knowledge of traits and the production and marketing system to assess the economic impact (gross margin) of new cultivars on the system, with the overall goal of improving the profitability of the industry through the release of new strawberry cultivars. Genotypes varied widely in their effect on gross margin, from 48% above to 10% below the base value. The advantage of a new genotype was also affected by the proportion of total area allocated to it. The largest difference in gross margin between the optimum allocation (8% increase in gross margin) and an all-of-industry allocation (20% decrease in gross margin) of area to a genotype was 28%. In other cases the all-of-industry allocation was also the optimum allocation, with one genotype giving a 48% benefit in gross margin.
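The gross-margin comparison can be sketched in a few lines: gross margin is income minus variable cost, compared as a percent change from a base cultivar, and an industry-level margin weights the new genotype by its share of planted area. All figures in the test are invented, not Queensland trial data.

```python
# Toy gross-margin accounting for comparing genotypes.

def gross_margin(yield_trays, price_per_tray, variable_cost):
    """Gross margin in illustrative currency units."""
    return yield_trays * price_per_tray - variable_cost

def percent_vs_base(gm, gm_base):
    """Percent change in gross margin relative to the base cultivar."""
    return 100.0 * (gm - gm_base) / gm_base

def industry_gm(gm_new, gm_base, share_new):
    """Industry gross margin when share_new of area grows the new genotype."""
    return share_new * gm_new + (1.0 - share_new) * gm_base
```

Scanning `industry_gm` over `share_new` reproduces the abstract's point that the optimum allocation need not be the all-of-industry allocation.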
Abstract:
The electricity market and the climate are both undergoing change. These changes impact hydropower and provoke interest in hydropower capacity increases. In this thesis a new methodology was developed, utilising short-term hydropower optimisation and planning software, for more accurate capacity-increase profitability analysis. In the methodology, income increases are calculated over month-long periods while varying average discharge and electricity price volatility. The monthly incomes are used to construct year scenarios, and from different types of year scenarios a long-term profitability analysis can be made. Average price development is included via a multiplier. The method was applied to the Oulujoki hydropower plants. The capacity additions analysed for Oulujoki were found not to be profitable; however, the methodology proved versatile and useful. The results showed that short periods of peaking prices play a major role in the profitability of capacity increases. Adding more discharge capacity to hydropower plants that initially bypassed water more often showed the best improvement in both income and power-generation profile flexibility.
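The scenario construction can be sketched simply: monthly incomes are stacked into a year scenario, a price-development multiplier scales each year, and discounting the year scenarios yields the long-term figure. The numbers in the test are illustrative, not Oulujoki results.

```python
# Toy scenario arithmetic for the long-term profitability analysis.

def year_income(monthly_incomes, price_multiplier=1.0):
    """Income of one year scenario built from 12 monthly incomes,
    scaled by an average-price-development multiplier."""
    return price_multiplier * sum(monthly_incomes)

def npv_of_scenarios(yearly_incomes, rate):
    """Net present value of a sequence of year-scenario incomes."""
    return sum(inc / (1.0 + rate) ** (t + 1)
               for t, inc in enumerate(yearly_incomes))
```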