920 results for clause combining
Abstract:
Funding — Forest Enterprise Scotland and the University of Aberdeen provided funding for the project. The Carnegie Trust supported the lead author, E. McHenry, in this research through the award of a tuition fees bursary.
Abstract:
The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost in operational flexibility. Enterprise 2.0, it is claimed, can support flexible business process management and thus incorporate informal and less structured interactions. The traditional view, however, is that efficiency and flexibility are incompatible objectives, pursued separately in different organizational environments. An ERP system whose primary objective is efficiency and an enterprise 2.0 system whose primary aim is flexibility may therefore represent a contradiction, with a high risk of failure if adopted simultaneously. The chapter uses case study analysis to investigate the combination of ERP and enterprise 2.0 in a single enterprise aiming to improve both efficiency and flexibility in operations. It provides an in-depth analysis of this combination based on socio-technical information systems management theory, and summarizes the benefits of combining ERP systems with enterprise 2.0 and how they could contribute to a new generation of business management that unites formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, while retaining the flexibility to react with agility to internal and external events.
Abstract:
This paper examines the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP). FEC can be used to reduce the number of retransmissions that would usually result from lost packets, greatly reducing the burden on TCP to deal with losses. There is, however, a side-effect to using FEC as a countermeasure to packet loss: an additional bandwidth requirement. For applications such as real-time video conferencing, delay must be kept to a minimum and retransmissions are undesirable, so a balance must be struck between the additional bandwidth and the delay due to retransmissions. Our results show that, when packet loss occurs, data throughput can be significantly improved by combining FEC with TCP rather than relying solely on TCP retransmissions. Furthermore, a case study applies this result to demonstrate the achievable improvements in the quality of streaming video as perceived by end users.
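To make the trade-off concrete, here is a minimal, self-contained sketch of the simplest FEC scheme of this kind: one XOR parity packet per block of data packets, which lets the receiver rebuild a single lost packet without a TCP retransmission. This illustrates the general idea only, not the scheme evaluated in the paper; the block size and packet contents are arbitrary.

```python
import random

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length payloads."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(block: list[bytes]) -> bytes:
    """Parity packet: XOR of every data packet in the block."""
    parity = block[0]
    for pkt in block[1:]:
        parity = xor_bytes(parity, pkt)
    return parity

def recover(received: dict[int, bytes], parity: bytes, block_size: int) -> dict[int, bytes]:
    """Rebuild at most one missing packet; two or more losses
    still need a TCP retransmission."""
    missing = [i for i in range(block_size) if i not in received]
    if len(missing) == 1:
        rebuilt = parity
        for pkt in received.values():
            rebuilt = xor_bytes(rebuilt, pkt)
        received[missing[0]] = rebuilt
    return received

# Simulate a block of 4 data packets plus 1 parity packet, with one loss.
block = [bytes([i] * 8) for i in range(4)]
parity = make_parity(block)
lost = random.randrange(4)
received = {i: p for i, p in enumerate(block) if i != lost}
assert recover(received, parity, 4)[lost] == block[lost]
print(f"packet {lost} rebuilt from parity, no retransmission needed")
```

The overhead here is one parity packet per block, i.e. 25% extra bandwidth at block size 4; larger blocks cost less bandwidth but can repair proportionally fewer losses, which is exactly the bandwidth/delay balance the paper studies.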
Abstract:
Surveys collect important data that inform policy decisions and drive social science research. Large government surveys gather information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges: in particular, one needs to account for missing data, for complex sampling designs, and for measurement error. Conceptually, a survey organization could devote substantial resources to obtaining high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. However, this scenario is often unrealistic. To address these practical issues, survey organizations can leverage information available from other data sources. For example, in longitudinal studies that suffer from attrition, they can use information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or from the survey design to improve inferences. They can use information from gold standard sources to correct for measurement error.
This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.
The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when the analysis is based only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples: data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be used to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their inferences, corrected for panel attrition, are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.
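The following toy simulation (not the thesis's actual model; all distributions and coefficients are invented) illustrates the identification problem: completers-only estimates are biased when attrition depends on the wave-2 outcome, while a refreshment sample, being a fresh random draw from the wave-2 population, recovers the true margin.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two-wave panel: Y2 is correlated with Y1, and attrition depends on Y2
# itself (nonignorable): the larger Y2, the more likely the dropout.
y1 = rng.normal(size=n)
y2 = 0.6 * y1 + rng.normal(scale=0.8, size=n)
stay = rng.random(n) < 1.0 / (1.0 + np.exp(0.8 * y2))

# Completer-only analysis is biased low, and the panel alone cannot say so.
print("completers' mean Y2: ", y2[stay].mean())

# A refreshment sample is a fresh random draw from the wave-2 population,
# so its margin reveals the true wave-2 mean (about 0 here).
y1_ref = rng.normal(size=5_000)
y2_ref = 0.6 * y1_ref + rng.normal(scale=0.8, size=5_000)
print("refreshment mean Y2: ", y2_ref.mean())
print("true panel mean Y2:  ", y2.mean())
```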
The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data.
We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.
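A minimal sketch of the data-augmentation construction, under assumptions: hypothetical variable names, a single desired margin, and pandas NaN as the missing-value code. The synthetic records match the prior margin on the chosen variable and leave everything else missing; the count n_aug plays the role of a prior sample size.

```python
import numpy as np
import pandas as pd

def augment_margin(df: pd.DataFrame, var: str, prior_probs: dict, n_aug: int) -> pd.DataFrame:
    """Append n_aug synthetic records whose empirical margin on `var`
    matches prior_probs; all other variables are left missing (NaN).
    Larger n_aug encodes a stronger (less uncertain) prior."""
    counts = {level: round(p * n_aug) for level, p in prior_probs.items()}
    synth = pd.DataFrame({var: np.repeat(list(counts), list(counts.values()))},
                         columns=df.columns)  # other columns default to NaN
    return pd.concat([df, synth], ignore_index=True)

# Toy survey with two categorical variables; prior belief: P(sex=F) = 0.51.
survey = pd.DataFrame({"sex": ["F", "M", "M"], "employed": ["yes", "no", "yes"]})
augmented = augment_margin(survey, "sex", {"F": 0.51, "M": 0.49}, n_aug=200)
print(augmented["sex"].value_counts(normalize=True))
```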
The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
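The sketch below is a stylized, non-Bayesian caricature of this idea (the thesis's actual models are richer): estimate P(reported | true) from a hypothetical gold standard file in which both values are observed, then invert it with Bayes' rule to obtain correction probabilities for a survey where only the reported value is seen. All category labels, sample sizes, and the error matrix are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
levels = ["no_degree", "bachelors", "masters"]

# Gold standard survey: both true and reported education are observed.
# `error` is the assumed data-generating reporting-error process, with
# some over-reporting of higher degrees; we then re-estimate it from data.
true_gs = rng.choice(3, size=20_000, p=[0.5, 0.35, 0.15])
error = np.array([[0.92, 0.06, 0.02],
                  [0.03, 0.90, 0.07],
                  [0.01, 0.04, 0.95]])
reported_gs = np.array([rng.choice(3, p=error[t]) for t in true_gs])
P_rep_given_true = np.array([np.bincount(reported_gs[true_gs == t], minlength=3)
                             / (true_gs == t).sum() for t in range(3)])

# Main survey: only reported values. Correct via Bayes' rule:
# P(true | reported) is proportional to P(reported | true) * P(true).
prior_true = np.bincount(true_gs, minlength=3) / len(true_gs)
posterior = P_rep_given_true.T * prior_true          # rows: reported level
posterior /= posterior.sum(axis=1, keepdims=True)
print("P(true | reported='masters'):", dict(zip(levels, posterior[2].round(3))))
```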
Abstract:
Taxonomies have gained broad usage in a variety of fields due to their extensibility, as well as their use for classification and knowledge organization. Of particular interest is the digital document management domain, in which their hierarchical structure can be effectively employed to organize documents into content-specific categories. Common or standard taxonomies (e.g., the ACM Computing Classification System) contain concepts that are too general for conceptualizing specific knowledge domains. In this paper we introduce a novel automated approach that combines sub-trees from general taxonomies with specialized seed taxonomies by using specific Natural Language Processing techniques. We provide an extensible and generalizable model for combining taxonomies in the practical context of two very large European research projects. Because the manual combination of taxonomies by domain experts is a highly time-consuming task, our model measures the semantic relatedness between concept labels in CBOW or skip-gram Word2vec vector spaces. A preliminary quantitative evaluation of the resulting taxonomies is performed after applying a greedy algorithm with incremental thresholds used for matching and combining topic labels.
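As an illustration of the matching step, the sketch below trains a toy skip-gram Word2vec model, represents each multi-word concept label by its mean word vector, and greedily merges label pairs as a similarity threshold is incrementally relaxed. The labels, corpus, and thresholds are placeholders; in practice the vectors would come from a large corpus, and this is not the authors' exact algorithm.

```python
import numpy as np
from gensim.models import Word2Vec

general = ["machine learning", "computer vision", "data mining"]  # general sub-tree labels
seed = ["deep learning", "image recognition", "pattern mining"]   # seed taxonomy labels

# Toy skip-gram model trained on the label tokens themselves; real use
# would train CBOW/skip-gram vectors on a large domain corpus.
corpus = [label.split() for label in general + seed] * 50
model = Word2Vec(corpus, vector_size=50, min_count=1, sg=1, seed=0)

def label_vec(label: str) -> np.ndarray:
    """Represent a multi-word concept label by its mean word vector."""
    return np.mean([model.wv[w] for w in label.split()], axis=0)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Greedy matching with incrementally relaxed thresholds: the most
# confident merges are committed first, weaker ones only later.
matched = set()
for threshold in (0.9, 0.7, 0.5):
    for g in general:
        if g in matched:
            continue
        best = max(seed, key=lambda s: cosine(label_vec(g), label_vec(s)))
        score = cosine(label_vec(g), label_vec(best))
        if score >= threshold:
            matched.add(g)
            print(f"threshold {threshold}: merge '{g}' <-> '{best}' ({score:.2f})")
```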
Abstract:
Taphonomic research of bones can provide additional insight into a site's formation and development, the burial environment and ongoing post-mortem processes. A total of 30 tortoise (Cylindraspis) femur bone samples from the Mare aux Songes site (Mauritius) were studied histologically, assessing parameters such as presence and type of microbial alteration, inclusions, staining/infiltrations, the degree of microcracking and birefringence. The absence of microbial attack in the 4200-year-old Mare aux Songes bones suggests the animals rapidly entered the soil whole-bodied and were sealed anoxically, although they suffered from biological and chemical degradation (i.e. pyrite formation/oxidation, mineral dissolution and staining) related to changes in the site's hydrology. Additionally, carbon and nitrogen stable isotopes were analysed to obtain information on the animals' feeding behaviour. The results show narrowly distributed δ13C ratios, indicating a terrestrial C3 plant-based diet, combined with a wide range in δ15N ratios. This is most likely related to the tortoises' drought-adaptive ability to change their metabolic processes, which can affect the δ15N ratios. Furthermore, ZooMS collagen fingerprinting analysis successfully identified two tortoise species (C. triserrata and C. inepta) in the bone assemblage, which, when combined with stable isotope data, revealed significantly different δ15N ratios between the two tortoise species. As climatic changes around this period resulted in increased aridity in the Mascarene Islands, this could explain the extremely elevated δ15N ratios in our dataset. The endemic fauna was able to endure the climatic changes 4200 years ago, although human arrival in the 17th century changed the original habitat to such an extent that it resulted in the extinction of several species. Fortunately, we are still able to study these extinct tortoises due to the beneficial conditions of their burial environment, which resulted in excellent bone preservation.
Abstract:
Automatic detection systems do not perform as well as human observers, even on simple detection tasks. A potential solution to this problem is training vision systems on appropriate regions of interest (ROIs), in contrast to training on predefined and arbitrarily selected regions. Here we focus on detecting pedestrians in static scenes. Our aim is to answer the following question: can automatic vision systems for pedestrian detection be improved by training them on perceptually-defined ROIs?
Abstract:
In the last decade, research in Computer Vision has developed several algorithms to help botanists and non-experts classify plants based on images of their leaves. LeafSnap is a mobile application that uses a multiscale curvature model of the leaf margin to classify leaf images into species. It has achieved high levels of accuracy on 184 tree species from the Northeast US. We extend the research that led to the development of LeafSnap along two lines. First, LeafSnap's underlying algorithms are applied to a set of 66 tree species from Costa Rica. Then, texture is used as an additional criterion to measure the level of improvement achieved in the automatic identification of Costa Rican tree species. A 25.6% improvement was achieved on a Costa Rican clean image dataset and 42.5% on a Costa Rican noisy image dataset; in both cases, our results show the improvement to be statistically significant. Further statistical analysis of the impact of visual noise, the best algorithm combinations per species, and the best value of k, the minimal cardinality of the set of candidate species that the tested algorithms render as best matches, is also presented in this research.
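The evaluation described above can be made concrete with a small sketch: given per-species scores from two cues (standing in for the curvature and texture classifiers, here just random placeholders), fuse them at the score level and measure top-k accuracy, i.e. whether the true species is among the k best candidates. Nothing here reproduces the paper's actual features or numbers.

```python
import numpy as np

rng = np.random.default_rng(2)
n_species, n_queries = 66, 200

# Placeholder per-species scores from two cues; in the real system these
# would come from the curvature and texture classifiers respectively.
curvature = rng.random((n_queries, n_species))
texture = rng.random((n_queries, n_species))
truth = rng.integers(0, n_species, size=n_queries)
curvature[np.arange(n_queries), truth] += 0.5   # favour the true species a bit
texture[np.arange(n_queries), truth] += 0.5     # so the sketch shows a trend

def topk_accuracy(scores: np.ndarray, k: int) -> float:
    """Fraction of queries whose true species is among the k best-scoring."""
    topk = np.argsort(-scores, axis=1)[:, :k]
    return float((topk == truth[:, None]).any(axis=1).mean())

combined = curvature + texture   # simplest score-level fusion
for k in (1, 5, 10):
    print(f"k={k:2d}  curvature only: {topk_accuracy(curvature, k):.2f}"
          f"  curvature+texture: {topk_accuracy(combined, k):.2f}")
```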
Abstract:
This dissertation is concerned with the control, combining, and propagation of laser beams through a turbulent atmosphere. In the first part we consider adaptive optics: the process of controlling the beam based on information about the current state of the turbulence. If the target is cooperative and provides a coherent return beam, the phase measured near the beam transmitter and adaptive optics can, in principle, correct these fluctuations. However, for many applications the target is uncooperative. In this case, we show that an incoherent return from the target can be used instead. Using the principle of reciprocity, we derive a novel relation between the field at the target and the scattered field at a detector. We then demonstrate through simulation that an adaptive optics system can utilize this relation to focus a beam through atmospheric turbulence onto a rough surface. In the second part we consider beam combining. To achieve the power levels needed for directed energy (DE) applications it is necessary to combine a large number of lasers into a single beam. The large linewidths inherent in high-power fiber and slab lasers cause random phase and intensity fluctuations on sub-nanosecond time scales. We demonstrate that this presents a challenging problem when attempting to phase-lock high-power lasers. Furthermore, we show that even if instruments are developed that can precisely control the phase of high-power lasers, coherent combining remains problematic for DE applications: the dephasing effects of atmospheric turbulence typically encountered in DE applications will degrade the coherent properties of the beam before it reaches the target. Finally, we investigate the propagation of Bessel and Airy beams through atmospheric turbulence. It has been proposed that these quasi-non-diffracting beams could be resistant to the effects of atmospheric turbulence. However, we find that atmospheric turbulence disrupts the quasi-non-diffracting nature of Bessel and Airy beams when the transverse coherence length nears the initial aperture diameter or diagonal, respectively. The turbulence-induced transverse phase distortion limits the effectiveness of Bessel and Airy beams for applications requiring propagation over long distances in the turbulent atmosphere.
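A short Monte Carlo illustrates why phase noise is so damaging to coherent combining. For N equal-amplitude beams with independent Gaussian phase errors of RMS sigma, the on-axis combining efficiency is |mean(exp(i*phi))|^2, which for large N approaches exp(-sigma^2); the values of N and sigma below are arbitrary, and this is a textbook-level sketch rather than the dissertation's simulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n_lasers, n_trials = 32, 2_000

# eta = |mean over lasers of exp(i*phi)|^2 with phi ~ N(0, sigma^2);
# for large N the expected efficiency tends to exp(-sigma^2).
for sigma in (0.1, 0.5, 1.0, 2.0):          # RMS phase error in radians
    phi = rng.normal(scale=sigma, size=(n_trials, n_lasers))
    eta = np.abs(np.exp(1j * phi).mean(axis=1)) ** 2
    print(f"sigma={sigma:3.1f} rad  mean eta={eta.mean():.3f}"
          f"  exp(-sigma^2)={np.exp(-sigma**2):.3f}")
```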
Abstract:
Due to trends in aero-design, aeroelasticity becomes increasingly important in modern turbomachines. Design requirements of turbomachines lead to the development of high-aspect-ratio blades and blade-integrated disc designs (blisks), which are especially prone to complex modes of vibration. Therefore, experimental investigations yielding high-quality data are required to improve the understanding of aeroelastic effects in turbomachines. One way to obtain such data is to excite and measure blade vibrations in turbomachines. The major requirement for blade excitation and blade vibration measurements is to minimize interference with the aeroelastic effects under investigation. Thus, in this paper, a non-contact, and therefore low-interference, experimental set-up for exciting and measuring blade vibrations is proposed and shown to work. A novel acoustic system excites rotor blade vibrations, which are measured with an optical tip-timing system. By performing measurements in an axial compressor, the potential of the acoustic excitation method for investigating aeroelastic effects is explored. The basic principle of this method is described and proven through the analysis of blade responses at different acoustic excitation frequencies and at different rotational speeds. To verify the accuracy of the tip-timing system, amplitudes measured by tip-timing are compared with strain gage measurements and found to agree well. Two approaches to varying the nodal diameter (ND) of the excited vibration mode by controlling the acoustic excitation are presented. By combining the different excitable acoustic modes with a phase-lag control, each ND of the investigated 30-blade rotor can be excited individually. This feature of the present acoustic excitation system is of great benefit to aeroelastic investigations and represents one of the main advantages over excitation methods proposed in the past. In future studies, the acoustic excitation method will be used to investigate aeroelastic effects in high-speed turbomachines in detail; the results are to be used to improve the aeroelastic design of modern turbomachines.
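The phase-lag idea can be sketched as follows: if N_s acoustic sources are equally spaced around the annulus, driving source j with a phase lag proportional to its circumferential angle produces a traveling pressure pattern of a chosen circumferential order, and hence excites the corresponding nodal diameter. The source count, target ND, and frequency below are hypothetical, and this is a conceptual sketch, not the paper's control implementation.

```python
import numpy as np

n_sources = 8        # acoustic sources equally spaced around the annulus
nd = 3               # target nodal diameter (must be <= n_sources // 2
                     # to avoid spatial aliasing of the excitation pattern)
f_exc = 1_250.0      # excitation frequency in Hz (hypothetical)

# Source j at circumferential angle theta_j is driven with phase lag
# nd * theta_j; the superposed field is then a traveling wave whose
# circumferential order, and hence excited ND, equals nd.
theta = 2.0 * np.pi * np.arange(n_sources) / n_sources
phase_lag = nd * theta

t = np.linspace(0.0, 4.0 / f_exc, 1_000)
drive = np.cos(2.0 * np.pi * f_exc * t[:, None] - phase_lag[None, :])
print("per-source phase lags (deg):", np.degrees(phase_lag).round(1))
print("drive signals shape (time x sources):", drive.shape)
```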