422 results for pyramid HoG


Relevance: 20.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance: 20.00%

Publisher:

Abstract:

To move from the realm of good intent to verifiable practice, ethics needs to be approached in the same way as any other desired outcome of the public relations process: that is, operationalized and evaluated at each stage of a public relations campaign. A pyramid model, the "ethics pyramid", is useful for incorporating ethical reflection and evaluation into the standard structure of a typical public relations plan. Practitioners can use it to integrate and manage ethical intent, means, and ends by setting ethics objectives, considering the ethics of each campaign tactic, and reporting whether ethical outcomes have been attained.

Relevance: 20.00%

Publisher:

Abstract:

Testing for two-sample differences is challenging when the differences are local and involve only a small portion of the data. To solve this problem, we apply a multi-resolution scanning framework that performs dependent local tests on subsets of the sample space. We use a nested dyadic partition of the sample space to obtain a collection of windows and test for sample differences within each window. We place a joint prior on the states of the local hypotheses that allows both vertical and horizontal message passing along the partition tree, reflecting the spatial dependency among windows. This information-passing framework is critical for detecting local sample differences. We use both the loopy belief propagation algorithm and MCMC to obtain the posterior null probability for each window. These probabilities are then used to report sample differences based on decision procedures. Simulation studies are conducted to illustrate the performance, and multiple-testing adjustment and convergence of the algorithms are also discussed.
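The windowing scheme in this abstract can be illustrated with a small sketch: a nested dyadic partition of [0, 1) and per-window counts of two samples whose difference is confined to one region. The testing, prior, and message-passing machinery of the paper are not reproduced here, and all names are illustrative.

```python
import numpy as np

def dyadic_windows(max_depth):
    """Enumerate the nested dyadic partition of [0, 1): at depth k
    the unit interval splits into 2**k equal-width windows."""
    windows = []
    for depth in range(max_depth + 1):
        width = 1.0 / 2**depth
        for j in range(2**depth):
            windows.append((depth, j * width, (j + 1) * width))
    return windows

def local_counts(x, y, lo, hi):
    """Counts of the two samples falling inside the window [lo, hi)."""
    return int(np.sum((x >= lo) & (x < hi))), int(np.sum((y >= lo) & (y < hi)))

# Two samples that differ only locally, inside [0.25, 0.5).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 2000)
y = np.concatenate([rng.uniform(0, 1, 1500),
                    rng.uniform(0.25, 0.5, 500)])  # local excess mass

for depth, lo, hi in dyadic_windows(2):
    nx, ny = local_counts(x, y, lo, hi)
    print(f"depth {depth} window [{lo:.2f}, {hi:.2f}): {nx} vs {ny}")
```

Only the windows overlapping [0.25, 0.5) show a marked count imbalance, which is the kind of local signal the dependent tests are designed to pick up.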

Relevance: 20.00%

Publisher:

Abstract:

Despite the potential impact of ocean acidification on ecosystems such as coral reefs, there is surprisingly limited field data on the relationship between calcification and seawater carbonate chemistry. In this study, contemporaneous in situ datasets of seawater carbonate chemistry and calcification rates from the high-latitude coral reef of Bermuda, collected over annual timescales, provide a framework for investigating the present and future impact of rising carbon dioxide (CO2) levels and ocean acidification on coral reef ecosystems in their natural environment. A strong correlation was found between the in situ calcification rates of the major framework-building coral species Diploria labyrinthiformis and the seasonal variability of [CO₃²⁻] and the aragonite saturation state (Ωarag), rather than other environmental factors such as light and temperature. These field observations provide sufficient data to hypothesize a seasonal "Carbonate Chemistry Coral Reef Ecosystem Feedback" (CREF hypothesis) between the primary components of the reef ecosystem (i.e., scleractinian hard corals and macroalgae) and seawater carbonate chemistry. In early summer, strong net autotrophy from benthic components of the reef system enhances [CO₃²⁻] and Ωarag, and hence rates of coral calcification, through the photosynthetic uptake of CO2. In late summer, rates of coral calcification are suppressed by the release of CO2 from reef metabolism during a period of strong net heterotrophy. This seasonal CREF mechanism is likely present in other tropical reefs, although attenuated compared to high-latitude reefs such as Bermuda. Owing to lower annual mean surface seawater [CO₃²⁻] and Ωarag in Bermuda compared to tropical regions, we anticipate that Bermuda corals will experience seasonal periods of zero net calcification within the next decade, at [CO₃²⁻] and Ωarag thresholds of ~184 µmol kg⁻¹ and ~2.65, respectively.
However, net autotrophy of the reef during winter and spring (as part of the CREF hypothesis) may delay the onset of zero net ecosystem calcification (NEC) or decalcification by enhancing [CO₃²⁻] and Ωarag. The Bermuda coral reef is one of the first responders to the negative impacts of ocean acidification, and we estimate that calcification rates for D. labyrinthiformis have already declined by >50% compared to pre-industrial times.
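The saturation-state threshold quoted above can be unpacked with the standard definition Ω = [Ca²⁺][CO₃²⁻]/K'sp. Below is a minimal sketch assuming a typical surface-seawater calcium concentration and a representative stoichiometric solubility product for aragonite; neither value comes from the study, and K'sp in particular varies with temperature and salinity, so the computed Ω is only indicative.

```python
def omega_aragonite(co3_umol_kg, ca_mol_kg=0.01028, ksp=6.65e-7):
    """Aragonite saturation state: omega = [Ca2+][CO3 2-] / K'sp.

    ca_mol_kg: assumed calcium concentration of surface seawater (mol/kg).
    ksp: assumed stoichiometric solubility product (mol^2 kg^-2) for
    warm surface seawater; both defaults are illustrative, not values
    taken from the Bermuda study.
    """
    co3_mol_kg = co3_umol_kg * 1e-6
    return ca_mol_kg * co3_mol_kg / ksp

# The carbonate-ion threshold reported in the abstract, ~184 umol kg^-1:
print(round(omega_aragonite(184.0), 2))
```

With these assumed constants the result lands near the abstract's quoted Ωarag threshold of ~2.65; the exact value depends on the K'sp used.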

Relevance: 20.00%

Publisher:

Abstract:

The business system Pyramid does not currently provide its users with a reasonable case-management system for support issues. The current system requires the customer to contact the provider by telephone to register new cases, and offers no way for the customer to view existing cases without contacting the provider. A solution to this issue is to migrate case management from telephone contact to a web-based platform, where customers can more easily access their current cases and also create new cases directly through the website. This new system would reduce the time required to manually manage each individual case, for both customer and provider, resulting in an overall reduction in cost for both parties. The result is a system divided into two sections: the first is an API created in Pyramid that acts as a web service, and the second is a website to which customers can connect. The website allows users to view their current cases and to create new cases directly through the site; all information shown on the website is obtained through the web service inside Pyramid. Analyzing the final design of the system, the developers were able to identify both positive and negative aspects of it. Whether the chosen platform was the optimal choice, and what could be included if the system is developed further, are discussed. The development process and the method used during development are also analyzed and discussed, including the positive and negative aspects that were encountered and the cause and effect of working with a development team smaller than the suggested size. Lastly, an analysis of actions that could have been taken to prevent certain issues from occurring is presented.
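The two-part design described here (a web service exposed by Pyramid plus a customer-facing website) can be sketched as an in-memory case store offering the two operations customers need: registering a new case and listing existing ones. All names (CaseStore, create_case, cases_for) are illustrative and do not come from the thesis or the Pyramid product.

```python
import itertools

class CaseStore:
    """In-memory stand-in for the case-management web service."""

    def __init__(self):
        self._cases = {}
        self._ids = itertools.count(1)

    def create_case(self, customer, title):
        """Roughly what a 'register new case' endpoint would do."""
        case_id = next(self._ids)
        self._cases[case_id] = {"id": case_id, "customer": customer,
                                "title": title, "status": "open"}
        return case_id

    def cases_for(self, customer):
        """Roughly what a 'list my current cases' endpoint would return."""
        return [c for c in self._cases.values() if c["customer"] == customer]

store = CaseStore()
cid = store.create_case("ACME", "Printer integration fails")
print(store.cases_for("ACME"))
```

In the real system these two operations would sit behind HTTP endpoints served by the Pyramid API, with the website as the client.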

Relevance: 20.00%

Publisher:

Abstract:

This dissertation examines a unique working class in the United States: the men and women who worked on steamboats from the Industrial Revolution until the demise of steam-powered boats in the mid-20th century. The steamboat was the beginning of a technological system that was developed in America and used in such great numbers that it made the rapid population of the Trans-Appalachian West possible. The steamboat was forever romanticized by images of the antebellum South and by the quick wit of Samuel Clemens and his sentimental book, Life on the Mississippi: the imagination swirls with thoughts of bleach-white boats slowly churning the calm waters of some Spanish-moss-covered river. The reality of the boats, and the experience of those who worked on them, has been lost in this nostalgic vision. This research details the history of the western steamboat in the Monongahela Valley, the birthplace of the commercial steamboat industry. The first part of this dissertation examines the literature of labor history and industrial archaeology to place this work in the larger context of published scholarship. The second part builds a framework for understanding the eras the steamboat went through, in terms of both technological change and the change the workers experienced as their identity as a working class was being shaped. The third part details the excavations of two steamboat captains' houses, those of Captain James Gormley and Captain Michael A. Cox. Both men represented a time when the steamboat was in an era of transition, and excavations at their homes yield clues to their class status and how integrated they were in the local community. The fourth part of this study documents the oral histories of steamboat workers, both men and women, and their experience on the boats and on the river. The rapidly declining population of those who lived and worked on the boats lends urgency to documenting their lives.
Finally, this study concludes with a synthesis of how worker identity solidified in the face of technological, socio-economic, and ideological change, especially during the push for unionization and the introduction of the diesel towboat.

Relevance: 20.00%

Publisher:

Abstract:

Impelled by neo-liberal ideology, base-of-the-pyramid (BOP) and subsistence-market discourses have placed emphasis on markets, profits and entrepreneurialism. Because of this ideological mooring, these discourses offer only a marginal understanding of the role of the State and its impact on the poor. Franz Kafka's work provides a critical perspective on the role of the State in BOP or subsistence settings. This ethnographic study in India examines transactions related to land and highlights the Kafkaesque nature of the State. The institutional setting is fraught with Kafkaesque elements such as inaccessible and indecipherable legality, abusive power relations and the alienation of subaltern subjects. It further shows that the illicit character of the State is an important reason for illegal practices in subaltern settings.

Relevance: 10.00%

Publisher:

Abstract:

This paper assesses and compares the performance of two daylight collection strategies, one passive and one active, for large-scale mirrored light pipes (MLP) illuminating deep-plan buildings. Both strategies use laser cut panels (LCP) as the main component of the collection system. The passive system comprises LCPs in pyramid form, whereas the active system uses a tiled LCP on a simple rotation mechanism that rotates 360° in 24 hours. Performance is assessed using scale-model testing under sunny sky conditions and mathematical modelling. Results show average illuminance levels ranging from 50 to 250 lux for the pyramid LCP and from 150 to 200 lux for the rotating LCP. Both systems improve the performance of an MLP: compared to an open pipe, the pyramid LCP increases performance by 2.5 times and the rotating LCP by 5 times, particularly for low sun elevation angles.

Relevance: 10.00%

Publisher:

Abstract:

It has been recognised that brands play a role in industrial markets, but to date a comprehensive model of business-to-business (B2B) branding does not exist, nor has there been an empirical study of the applicability of a full brand equity model in a B2B context. This paper is the first to begin to address these issues. It introduces the Customer-Based Brand Equity (CBBE) model by Kevin Keller (1993; 2001; 2003) and empirically tests its applicability in the market for electronic tracking systems for waste management. While Keller claims that the CBBE pyramid can be applied in a B2B context, this research highlights the challenges of such an application and suggests that changes to the model are required. Firstly, assessing the equity of manufacturers' brand names is more appropriate than measuring the equity of individual product brands, as Keller suggests. Secondly, the building blocks of Keller's model appear useful in an organisational context, although differences in the sub-dimensions are required: brand feelings appear to lack relevance in the industrial market investigated, and the pinnacle of Keller's pyramid, resonance, needs serious modification. Finally, company representatives play a role in building brand equity, indicating a need for this human element to be recognised in a B2B model.

Relevance: 10.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the most suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics; the decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as sensitivities in human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed, based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used; in this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In the lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without needing to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification, so enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
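The generalized Gaussian model used above has density f(x) ∝ exp(−(|x|/α)^β), where β controls the shape (β = 2 gives a Gaussian, β = 1 a Laplacian). The thesis estimates the parameters via a least-squares formulation; the sketch below uses the simpler moment-ratio (Mallat-style) estimator only to illustrate the model, and all names are illustrative.

```python
import numpy as np
from math import gamma

def ggd_ratio(beta):
    """E|X| / sqrt(E X^2) for a zero-mean generalized Gaussian:
    Gamma(2/beta) / sqrt(Gamma(1/beta) * Gamma(3/beta))."""
    return gamma(2 / beta) / np.sqrt(gamma(1 / beta) * gamma(3 / beta))

def fit_shape(samples, betas=np.linspace(0.3, 4.0, 3000)):
    """Pick the shape parameter whose theoretical moment ratio best
    matches the empirical one (a crude grid search, for illustration)."""
    r = np.mean(np.abs(samples)) / np.sqrt(np.mean(samples ** 2))
    ratios = np.array([ggd_ratio(b) for b in betas])
    return float(betas[np.argmin(np.abs(ratios - r))])

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)          # Gaussian data: true beta = 2
print(round(fit_shape(x), 1))         # should recover roughly 2.0
```

Wavelet detail coefficients of natural and fingerprint images typically fit with β well below 2, i.e. distributions much peakier than Gaussian, which is what motivates the specialized quantizer design.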

Relevance: 10.00%

Publisher:

Abstract:

Stereo vision is a method of depth perception in which depth information is inferred from two (or more) images of a scene taken from different perspectives. Practical applications for stereo vision include aerial photogrammetry, autonomous vehicle guidance, robotics and industrial automation. The initial motivation behind this work was to produce a stereo vision sensor for mining automation applications; for such applications, the input stereo images consist of close-range scenes of rocks. A fundamental problem faced by matching algorithms is the matching or correspondence problem: locating corresponding points or features in two images. For this application, speed, reliability, and the ability to produce a dense depth map are of foremost importance. This work implemented a number of area-based matching algorithms to assess their suitability for this application. Area-based techniques were investigated because of their potential to yield dense depth maps, their amenability to fast hardware implementation, and their suitability to textured scenes such as rocks. In addition, two non-parametric transforms, the rank and census, were compared. Both the rank and census transforms were found to improve the reliability of matching in the presence of radiometric distortion; this is significant since radiometric distortion commonly arises in practice. They also have low computational complexity, making them amenable to fast hardware implementation. It was therefore decided that matching algorithms using these transforms would be the subject of the remainder of the thesis. An analytic expression for the process of matching using the rank transform was derived from first principles. This work resulted in a number of important contributions. Firstly, the derivation yielded a constraint which must be satisfied for a correct match, termed the rank constraint.
The theoretical derivation of this constraint is in contrast to existing matching constraints, which have little theoretical basis. Experimental work with actual and contrived stereo pairs has shown that the new constraint is capable of resolving ambiguous matches, thereby improving match reliability. Secondly, a novel matching algorithm incorporating the rank constraint has been proposed. This algorithm was tested using a number of stereo pairs, and in all cases the modified algorithm consistently resulted in an increased proportion of correct matches. Finally, the rank constraint was used to devise a new method for identifying regions of an image where the rank transform, and hence matching, are more susceptible to noise. The rank constraint was also incorporated into a new hybrid matching algorithm, where it was combined with a number of other ideas, including the use of an image pyramid for match prediction and a method of edge localisation to improve match accuracy in the vicinity of edges. Experimental results obtained from the new algorithm showed that it is able to remove a large proportion of invalid matches and improve match accuracy.
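The census transform discussed above replaces each pixel with a bit string recording which neighbours are darker than it, so matching costs become Hamming distances and monotonic radiometric changes (gain, offset) leave the signature unchanged. A minimal sketch follows (3×3 window, brute-force 1-D disparity search on a synthetic pair); this is not the thesis's implementation, just an illustration of the transform.

```python
import numpy as np

def census_3x3(img):
    """Census transform: each pixel becomes an 8-bit signature encoding
    which of its 8 neighbours is darker than the centre pixel.
    np.roll wraps at the borders, so edge pixels are approximate."""
    sig = np.zeros(img.shape, dtype=np.int64)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        sig = sig | ((neighbour < img).astype(np.int64) << bit)
    return sig

def hamming(a, b):
    """Matching cost: Hamming distance between two census signatures."""
    return bin(int(a) ^ int(b)).count("1")

# Right image = left shifted 2 px with a 3x gain change; the census
# signatures ignore the gain because only intensity orderings are kept.
left = np.tile(np.arange(16.0) % 5, (8, 1))
right = 3.0 * np.roll(left, -2, axis=1)
cl, cr = census_3x3(left), census_3x3(right)

y, x = 4, 8
costs = [hamming(cl[y, x], cr[y, x - d]) for d in range(4)]
best = int(np.argmin(costs))  # index of the lowest census cost
```

On this periodic test pattern the true disparity of 2 attains zero cost; note that census keeps only orderings, so repetitive texture can still produce cost ties, which is exactly the kind of ambiguity the rank constraint above is designed to resolve.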

Relevance: 10.00%

Publisher:

Abstract:

We present an iterative hierarchical algorithm for multi-view stereo. The algorithm attempts to utilise as much contextual information as is available to compute highly accurate and robust depth maps. There are three novel aspects to the approach: 1) we incrementally improve the depth fidelity as the algorithm progresses through the image pyramid; 2) we show how to incorporate visual hull information (when available) to constrain depth searches; and 3) we show how to simultaneously enforce the consistency of each depth map by continual comparison with neighbouring depth maps. We show that this approach produces highly accurate depth maps and, since it is essentially a local method, is both extremely fast and simple to implement.
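The coarse-to-fine traversal of an image pyramid that the algorithm relies on can be sketched with simple 2×2 block averaging; the paper's depth-fidelity updates, visual-hull constraints, and neighbour consistency checks are beyond this fragment, and the function names are illustrative.

```python
import numpy as np

def build_pyramid(img, levels):
    """Return [full-res, half-res, quarter-res, ...] by 2x2 block
    averaging; estimation would start at the coarsest level and be
    refined while moving back down the pyramid."""
    pyr = [img]
    for _ in range(levels - 1):
        a = pyr[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2  # trim odd edges
        a = a[:h, :w]
        pyr.append((a[0::2, 0::2] + a[1::2, 0::2] +
                    a[0::2, 1::2] + a[1::2, 1::2]) / 4.0)
    return pyr

img = np.arange(64, dtype=float).reshape(8, 8)
pyr = build_pyramid(img, 3)
print([p.shape for p in pyr])  # [(8, 8), (4, 4), (2, 2)]
```

Because each level quarters the pixel count, a depth estimate computed at the coarsest level gives a cheap prediction for the search range one level down, which is what keeps such hierarchical methods fast.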

Relevance: 10.00%

Publisher:

Abstract:

Utilizing a mono-specific antiserum produced in rabbits to hog kidney aromatic L-amino acid decarboxylase (AADC), the enzyme was localized in rat kidney by immunoperoxidase staining. AADC was located predominantly in the proximal convoluted tubules; there was also weak staining in the distal convoluted tubules and collecting ducts. An increase in dietary potassium or sodium intake produced no change in density or distribution of AADC staining in kidney. An assay of AADC enzyme activity showed no difference in cortex or medulla with chronic potassium loading. A change in distribution or activity of renal AADC does not explain the postulated dopaminergic modulation of renal function that occurs with potassium or sodium loading.

Relevance: 10.00%

Publisher:

Abstract:

In 2004 Prahalad made managers aware of the great economic opportunity that the population at the BoP (Base of the Pyramid) represents for business, in the form of new potential consumers. However, MNCs (multi-national corporations) generally continue to penetrate low-income markets with the same strategies used at the top of the pyramid, or choose not to invest at all in these regions because they are intimidated by having to re-envision their business models. Over the years, the introduction of unadapted business models and products into developing countries has done nothing more than induce new needs and develop new dependencies. Through a critical review of the literature, this paper investigates and compares innovative approaches to operating in developing markets which depart from the usual Corporate Social Responsibility marketing rhetoric and instead consider the potential consumer at the BoP as a ring of continuity in the value chain, a resource that can itself produce value. Based on the concept of social embeddedness (London & Hart, 2004) and the principle that an open system contemplates different provisions (i.e., MNCs bring processes and technology, NGOs cultural mediating skills, governments laws and regulations, and native people know-how and traditions), this paper concludes with a new business model reference that empowers all actors to contribute to value creation, while allowing MNCs to support local growth by turning what Prahalad called 'inclusive capitalism' into a more sustainable 'inclusive entrepreneurial development'.