914 results for Assortative matching
Abstract:
There is uncertainty over the potential changes to rainfall across northern Australia under climate change. Since rainfall is a key driver of pasture growth, cattle numbers and the resulting animal productivity and beef business profitability, the ability to anticipate possible management strategies within such uncertainty is crucial. The Climate Savvy Grazing project used existing research, expert knowledge and computer modelling to explore best-bet management strategies under best-, median- and worst-case future climate scenarios. All three scenarios indicated changes to the environment and resources upon which the grazing industry of northern Australia depends. Well-adapted management strategies under a changing climate are very similar to best practice within current climatic conditions. Maintaining good land condition builds resource resilience, maximises opportunities in higher-rainfall years and reduces the risk of degradation during drought and failed wet seasons. Matching the stocking rate to the safe long-term carrying capacity of the land is essential; reducing stock numbers in response to poor seasons and conservatively increasing stock numbers in response to better seasons generally improves profitability and maintains land in good condition. Spelling over the summer growing season will improve land condition under a changing climate as it does under current conditions. Six regions were included within the project. Of these, the Victoria River District in the Northern Territory, the Gulf country of Queensland and the Kimberley region of Western Australia had projections of similar or higher than current rainfall and the potential for carrying capacity to increase. The Alice Springs, Maranoa-Balonne and Fitzroy regions had projections of generally drying conditions and the greatest risk of reduced pasture growth and carrying capacity.
Encouraging producers to consider and act on the risks, opportunities and management options inherent in climate change was a key goal of the project. More than 60,000 beef producers, advisors and stakeholders are now more aware of the management strategies which build resource resilience, and that resilience helps buffer against the effects of variable and changing climatic conditions. Over 700 producers have stated they have improved confidence, skills and knowledge to attempt new practices to build resilience. During the course of the project, more than 165 beef producers reported they have implemented changes to build resource and business resilience.
Abstract:
Mathematical models for the stress analysis of the symmetric multidirectional double cantilever beam (DCB) specimen, using classical beam theory and first- and higher-order shear deformation beam theories, have been developed to determine the Mode I strain energy release rate (SERR) for symmetric multidirectional composites. The SERR has been calculated using the compliance approach. In the present study, both variationally and nonvariationally derived matching conditions have been applied at the crack tip of the DCB specimen. For the unidirectional and cross-ply composite DCB specimens, beam models under both plane stress and plane strain conditions in the width direction are applicable with good performance, whereas for the multidirectional composite DCB specimen, only the beam model under plane strain conditions in the width direction appears to be applicable with moderate performance. Among the shear deformation beam theories considered, the higher-order shear deformation beam theory, having quadratic variation of transverse displacement over the thickness, performs best in determining the SERR for the multidirectional DCB specimen.
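As general context for the compliance approach mentioned above, the Mode I SERR follows from the standard Irwin-Kies relation (a reference formula, not this paper's shear-deformable models):

```latex
G_I \;=\; \frac{P^2}{2b}\,\frac{\mathrm{d}C}{\mathrm{d}a},
\qquad\text{e.g. classical beam theory: } C = \frac{2a^3}{3EI}
\;\Rightarrow\;
G_I = \frac{P^2 a^2}{b\,EI}
```

where $P$ is the applied load, $b$ the specimen width, $a$ the crack length, $C = \delta/P$ the compliance and $EI$ the bending stiffness of one DCB arm.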
Abstract:
In this paper we consider the problem of computing an “optimal” popular matching. We assume that our input instance G admits a popular matching, and we are asked to return not any popular matching but an optimal one, where the definition of optimality is given as part of the problem statement; for instance, optimality could be fairness, in which case we are required to return a fair popular matching. We show an O(n² + m) algorithm for this problem, assuming that the preference lists are strict, where m is the number of edges in G and n is the number of applicants.
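The notion can be made concrete with a brute-force sketch (toy data; hypothetical function names; this is not the paper's O(n² + m) algorithm): enumerate all matchings, keep the popular ones, then return a "fair" popular matching, here taken as one minimising the worst preference rank any applicant receives.

```python
from itertools import product

def rank(a, j, prefs):
    """Position of job j in applicant a's list; unmatched (None) ranks last."""
    return prefs[a].index(j) if j in prefs[a] else len(prefs[a])

def phi(M, T, prefs):
    """Number of applicants who strictly prefer matching M to matching T."""
    return sum(1 for a in prefs
               if rank(a, M.get(a), prefs) < rank(a, T.get(a), prefs))

def all_matchings(prefs):
    """Every assignment of applicants to acceptable jobs (or None), no job reused."""
    applicants = list(prefs)
    for choice in product(*[prefs[a] + [None] for a in applicants]):
        jobs = [j for j in choice if j is not None]
        if len(jobs) == len(set(jobs)):
            yield {a: j for a, j in zip(applicants, choice) if j is not None}

def fair_popular_matching(prefs):
    """Among popular matchings, pick one minimising the worst rank received."""
    ms = list(all_matchings(prefs))
    popular = [M for M in ms
               if all(phi(M, T, prefs) >= phi(T, M, prefs) for T in ms)]
    return min(popular,
               key=lambda M: max(rank(a, M.get(a), prefs) for a in prefs),
               default=None)
```

Exponential enumeration is for illustration only; the paper's contribution is doing this efficiently on strict lists.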
Abstract:
Network data packet capture and replay capabilities are basic requirements for forensic analysis of faults and security-related anomalies, as well as for testing and development. Cyber-physical networks, in which data packets are used to monitor and control physical devices, must operate within strict timing constraints, in order to match the hardware devices' characteristics. Standard network monitoring tools are unsuitable for such systems because they cannot guarantee to capture all data packets, may introduce their own traffic into the network, and cannot reliably reproduce the original timing of data packets. Here we present a high-speed network forensics tool specifically designed for capturing and replaying data traffic in Supervisory Control and Data Acquisition systems. Unlike general-purpose "packet capture" tools it does not affect the observed network's data traffic and guarantees that the original packet ordering is preserved. Most importantly, it allows replay of network traffic precisely matching its original timing. The tool was implemented by developing novel user interface and back-end software for a special-purpose network interface card. Experimental results show a clear improvement in data capture and replay capabilities over standard network monitoring methods and general-purpose forensics solutions.
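The timing-preserving replay idea can be illustrated with a minimal user-space sketch (hypothetical `replay` helper; a software loop like this cannot reach the hardware-level precision the tool's special-purpose network card provides):

```python
import time

def replay(packets, send):
    """Replay (timestamp, payload) records, preserving each packet's
    original offset from the first packet's timestamp."""
    t0 = packets[0][0]
    start = time.monotonic()
    for ts, payload in packets:
        # wait until this packet's original offset has elapsed
        delay = (ts - t0) - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send(payload)
```

Ordering is preserved by construction; the residual jitter of `time.sleep` is exactly the kind of imprecision the paper's hardware-assisted approach removes.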
Abstract:
The usual task in music information retrieval (MIR) is to find occurrences of a monophonic query pattern within a music database, which can contain both monophonic and polyphonic content. The so-called query-by-humming systems are a famous instance of content-based MIR. In such a system, the user's hummed query is converted into symbolic form to perform search operations in a similarly encoded database. The symbolic representation (e.g., textual, MIDI or vector data) is typically a quantized and simplified version of the sampled audio data, yielding faster search algorithms and space requirements that can be met in real-life situations. In this thesis, we investigate geometric approaches to MIR. We first study some musicological properties often needed in MIR algorithms, and then give a literature review of traditional (e.g., string-matching-based) MIR algorithms and novel techniques based on geometry. We also introduce some concepts from digital image processing, namely mathematical morphology, which we use to develop and implement four algorithms for geometric music retrieval. The symbolic representation in the case of our algorithms is a binary 2-D image. We use various morphological pre- and post-processing operations on the query and the database images to perform template matching / pattern recognition on the images. The algorithms are essentially extensions of classic image correlation and hit-or-miss transformation techniques used widely in template matching applications. They aim to be a future extension to the retrieval engine of C-BRAHMS, a research project of the Department of Computer Science at the University of Helsinki.
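The hit-or-miss transformation the algorithms build on can be sketched in pure Python (toy piano-roll image, rows as pitch and columns as time; names are illustrative): a foreground mask must match 1-pixels and a background mask must match 0-pixels at every reported offset.

```python
def hit_or_miss(image, fg, bg):
    """Offsets (row, col) where all fg-marked pixels are 1 and all
    bg-marked pixels are 0 in the binary image."""
    H, W = len(image), len(image[0])
    h, w = len(fg), len(fg[0])
    hits = []
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ok = all(
                (not fg[i][j] or image[r + i][c + j] == 1) and
                (not bg[i][j] or image[r + i][c + j] == 0)
                for i in range(h) for j in range(w))
            if ok:
                hits.append((r, c))
    return hits

# A 4x6 "piano roll" containing two descending two-note steps.
image = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 0, 0, 0],
]
fg = [[1, 0], [0, 1]]   # the pattern's note cells must be set
bg = [[0, 1], [1, 0]]   # the remaining window cells must be empty
```

Dropping the `bg` test turns this into plain correlation-style matching, which tolerates extra polyphonic notes inside the window.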
Abstract:
Conventional methods for determining the refractive index demand specimens of optical quality, the preparation of which is often very difficult. An indirect determination by matching the refractive indices of the specimen and an immersion liquid is a practical alternative for photoelastic specimens of nonoptical quality. An experimental arrangement used for this technique and observations made while matching the refractive indices of three different specimens are presented.
Abstract:
This dissertation consists of an introductory section and three theoretical essays analyzing the interaction of corporate governance and restructuring. The essays adopt an incomplete contracts approach and analyze the role of different institutional designs to facilitate the alignment of the objectives of shareholders and management (or employees) over the magnitude of corporate restructuring. The first essay analyzes how a firm's choice of production technology affects the employees' human capital investment. In the essay, the owners of the firm can choose between a specific and a general technology that both require a costly human capital investment by the employees. The specific technology is initially superior in using the human capital of employees but, in contrast to the general technology, it is not compatible with future innovations. As a result, anticipated changes in the specific technology diminish the ex ante incentives of the employees to invest in human capital unless the shareholders grant the employees specific governance mechanisms (a right of veto, severance pay) so as to protect their investments. The results of the first essay indicate that the level of protection that the shareholders are willing to offer falls short of the socially desirable one. Furthermore, when restructuring opportunities become more abundant, it becomes more attractive both socially and from the viewpoint of the shareholders to initially adopt the general technology. The second essay analyzes how the allocation of authority within the firm interacts with the owners' choice of business strategy when the ability of the owners to monitor the project proposals of the management is biased in favor of the status quo strategy. The essay shows that a bias in the monitoring ability will affect not only the allocation of authority within the firm but also the choice of business strategy. 
In particular, when delegation has positive managerial incentive effects, it turns out to be more attractive under the new business strategy because the improved managerial incentives are a way for the owners to compensate for their own reduced information-gathering ability. This effect, however, simultaneously makes the owners hesitant to switch strategy since doing so would involve a more frequent loss of control over the project choice. Consequently, the owners' lack of knowledge of the new business strategy may lead to a suboptimal choice of strategy. The third essay analyzes the implications of the CEO succession process for the ideal board structure. In this essay, the presence of the departing CEO on the board facilitates the board's ability to find a matching successor and to counsel him. However, the ex-CEO's presence may simultaneously weaken the board's ability to restructure, since the predecessor may use the opportunity to distort the successor's project choice. The results of the essay suggest that the extent of restructuring gains, the firm's ability to hire good outside directors and the importance of the board's advisory role affect at which point and for how long the shareholders may want to nominate the predecessor to the board.
Abstract:
In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for sponsored search auctions which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue, while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and some special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity. Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page with results, containing the links most relevant to the query and also sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. Against every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on a price to be charged to each advertiser. Due to increasing demands for Internet advertising space, most search engines currently use auction mechanisms for this purpose. These are called sponsored search auctions.
A significant percentage of the revenue of Internet giants such as Google, Yahoo!, MSN, etc., comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored auction context, and pursue the objective of designing a mechanism that is superior to these two mechanisms. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction since they are assured of gaining a non-negative payoff by doing so.
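The difference between the two baseline mechanisms can be sketched for a toy slot auction (hypothetical helper names; bids and click-through rates are illustrative): under GSP the winner of each slot pays the next-highest bid per click, while under VCG each winner pays the externality imposed on the bidders below.

```python
def gsp_per_click(bids, ctrs):
    """GSP: the winner of slot i pays the (i+1)-th highest bid per click."""
    order = sorted(bids, reverse=True) + [0.0] * len(ctrs)
    return [order[i + 1] for i in range(len(ctrs))]

def vcg_per_click(bids, ctrs):
    """VCG: a winner's total payment is the click value lost by lower
    bidders due to its presence; divide by the slot's CTR for a per-click price."""
    order = sorted(bids, reverse=True)
    k = len(ctrs)
    b = order + [0.0] * (k + 1)        # pad with zero "losing" bids
    c = list(ctrs) + [0.0]
    total = [0.0] * k
    for i in range(k - 1, -1, -1):     # work upward from the last slot
        below = total[i + 1] if i + 1 < k else 0.0
        total[i] = below + b[i + 1] * (c[i] - c[i + 1])
    return [total[i] / c[i] for i in range(k)]
```

On the toy instance below, VCG per-click prices are weakly below the GSP prices, the classic comparison between the two mechanisms.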
Abstract:
We investigate the use of a two-stage transform vector quantizer (TSTVQ) for coding of line spectral frequency (LSF) parameters in wideband speech coding. The first-stage quantizer of TSTVQ provides better matching of the source distribution, and the second-stage quantizer provides additional coding gain by using an individual cluster-specific decorrelating transform and variance normalization. Further coding gain is shown to be achieved by exploiting the slowly time-varying nature of speech spectra, using the inter-frame cluster continuity (ICC) property in the first stage of the TSTVQ method. The proposed method saves 3-4 bits and reduces the computational complexity by 58-66% compared to the traditional split vector quantizer (SVQ), but at the expense of 1.5-2.5 times the memory.
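The two-stage structure can be sketched with toy codebooks (illustrative names; this omits the cluster-specific decorrelating transform, variance normalization and the ICC refinement the paper adds):

```python
def nearest(x, codebook):
    """Codeword minimising squared Euclidean distance to x."""
    return min(codebook, key=lambda c: sum((xi - ci) ** 2 for xi, ci in zip(x, c)))

def tstvq_encode(x, stage1, stage2):
    """Quantize x with the stage-1 codebook, then quantize the residual."""
    c1 = nearest(x, stage1)
    residual = [xi - ci for xi, ci in zip(x, c1)]
    return c1, nearest(residual, stage2)

def tstvq_decode(c1, c2):
    """Reconstruction is the sum of the two selected codewords."""
    return [a + b for a, b in zip(c1, c2)]
```

The first stage captures the coarse shape of the source distribution; the second stage spends its bits only on the (smaller-variance) residual, which is where the transform and normalization yield the extra coding gain.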
Abstract:
Microsatellite markers have demonstrated their value for performing paternity exclusion and hence exploring mating patterns in plants and animals. Methodology is well established for diploid species, and several software packages exist for elucidating paternity in diploids; however, these issues are not so readily addressed in polyploids due to the increased complexity of the exclusion problem and a lack of available software. We introduce polypatex, an R package for paternity exclusion analysis using microsatellite data in autopolyploid, monoecious or dioecious/bisexual species with a ploidy of 4n, 6n or 8n. Given marker data for a set of offspring, their mothers and a set of candidate fathers, polypatex uses allele matching to exclude candidates whose marker alleles are incompatible with the alleles in each offspring-mother pair. polypatex can analyse marker data sets in which allele copy numbers are known (genotype data) or unknown (allelic phenotype data) – for data sets in which allele copy numbers are unknown, comparisons are made taking into account all possible genotypes that could arise from the compared allele sets. polypatex is a software tool that provides population geneticists with the ability to investigate the mating patterns of autopolyploids using paternity exclusion analysis on data from codominant markers having multiple alleles per locus.
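The core allele-matching test can be sketched for the simplest genotype-data case (hypothetical function names and toy allele sizes; the real package also handles unknown copy numbers by enumerating candidate genotypes, and dosage is ignored here): every offspring allele the mother cannot supply must appear in the candidate father.

```python
def compatible(offspring, mother, father):
    """A candidate father is not excluded if every offspring allele
    absent from the mother's allele set is present in his."""
    obligate_paternal = set(offspring) - set(mother)
    return obligate_paternal <= set(father)

def exclude(offspring, mother, candidates):
    """Names of candidate fathers whose alleles are incompatible
    with the offspring-mother pair at this locus."""
    return [name for name, alleles in candidates.items()
            if not compatible(offspring, mother, alleles)]
```

In practice exclusion is run per locus and candidates surviving all loci remain as possible fathers; mistyping tolerances complicate the real analysis.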
Abstract:
The dissertation consists of an introductory chapter and three essays that apply search-matching theory to study the interaction of labor market frictions, technological change and macroeconomic fluctuations. The first essay studies the impact of capital-embodied growth on equilibrium unemployment by extending a vintage capital/search model to incorporate vintage human capital. In addition to the capital obsolescence (or creative destruction) effect that tends to raise unemployment, vintage human capital introduces a skill obsolescence effect of faster growth that has the opposite sign. Faster skill obsolescence reduces the value of unemployment, hence wages and leads to more job creation and less job destruction, unambiguously reducing unemployment. The second essay studies the effect of skill biased technological change on skill mismatch and the allocation of workers and firms in the labor market. By allowing workers to invest in education, we extend a matching model with two-sided heterogeneity to incorporate an endogenous distribution of high and low skill workers. We consider various possibilities for the cost of acquiring skills and show that while unemployment increases in most scenarios, the effect on the distribution of vacancy and worker types varies according to the structure of skill costs. When the model is extended to incorporate endogenous labor market participation, we show that the unemployment rate becomes less informative of the state of the labor market as the participation margin absorbs employment effects. The third essay studies the effects of labor taxes on equilibrium labor market outcomes and macroeconomic dynamics in a New Keynesian model with matching frictions. Three policy instruments are considered: a marginal tax and a tax subsidy to produce tax progression schemes, and a replacement ratio to account for variability in outside options. 
In equilibrium, the marginal tax rate and replacement ratio dampen economic activity whereas tax subsidies boost the economy. The marginal tax rate and replacement ratio amplify shock responses whereas employment subsidies weaken them. The tax instruments affect the degree to which the wage absorbs shocks. We show that increasing tax progression when taxation is initially progressive is harmful for steady state employment and output, and amplifies the sensitivity of macroeconomic variables to shocks. When taxation is initially proportional, increasing progression is beneficial for output and employment and dampens shock responses.
Abstract:
The opposed-jet diffusion flame has been considered with four-step reaction kinetics for the hydrogen-oxygen system. The studies have revealed that flame broadening reduces and the maximum temperature increases as pressure increases. The relative importance of the different reaction steps has been brought out in the different regions (unstable, near-extinction and equilibrium). The present studies have also led to the deduction of the overall reaction rate constants of an equivalent single-step reaction by matching a certain overall set of parameters between the four-step reaction scheme and the equivalent single-step reaction.
Abstract:
In this paper, we present the design and characterization of a vibratory yaw rate MEMS sensor that uses in-plane motion for both actuation and sensing. The design criteria for the rate sensor are high sensitivity and low bandwidth. The required sensitivity of the yaw rate sensor is attained by using in-plane motion, in which the dominant damping mechanism is the fluid loss due to slide-film damping, i.e., two to three orders of magnitude less than the squeeze-film damping in other rate sensors with out-of-plane motion. The low bandwidth is achieved by matching the drive and the sense mode frequencies. Based on these factors, the yaw rate sensor is designed and finally realized using surface micromachining. The in-plane motion of the sensor is experimentally characterized to determine the sense and the drive mode frequencies and the corresponding damping ratios. It is found that the experimental results match well with the numerical and the analytical models, with less than 5% error in the frequency measurements. The measured quality factor of the sensor is approximately 467, which is two orders of magnitude higher than that for a similar rate sensor with an out-of-plane sense direction.
Abstract:
Phytoplankton ecology and productivity form one of the main branches of contemporary oceanographic research. Research groups in this branch have increasingly started to utilise bio-optical applications. My main research objective was to critically investigate the advantages and deficiencies of fast repetition rate (FRR) fluorometry for studies of phytoplankton productivity and of the responses of phytoplankton to varying environmental stress. Second, I aimed to clarify the applicability of the FRR system to the optical environment of the Baltic Sea. The FRR system offers a highly dynamic tool for studies of phytoplankton photophysiology and productivity both in the field and in a controlled environment. The FRR method provides high-frequency in situ determinations of the light-acclimative and photosynthetic parameters of intact phytoplankton communities. The measurement protocol is relatively easy to use and involves no phases requiring analytical determinations. The most notable application of the FRR system lies in its potential for making primary productivity (PP) estimations. However, the realisation of this scheme is not straightforward. FRR-PP estimates, based on the photosynthetic electron flow (PEF) rate, are linearly related to photosynthetic gas exchange (fixation of 14C) PP only in environments where photosynthesis is light-limited. If light limitation is not present, as is usually the case in the near-surface layers of the water column, the two PP approaches will deviate. The prompt response of the PEF rate to short-term variability in the natural light field makes field comparisons between PEF-PP and 14C-PP difficult to interpret, because this variability is averaged out in the 14C incubations. Furthermore, the FRR-based PP models are tuned to closely follow the vertical pattern of the underwater irradiance.
Due to the photoacclimational plasticity of phytoplankton, this easily leads to overestimates of water-column PP if precautionary measures are not taken. Natural phytoplankton is subject to broad-waveband light. Active non-spectral bio-optical instruments, like the FRR fluorometer, emit light in a relatively narrow waveband, which by its nature does not represent the in situ light field. Thus, the spectrally-dependent parameters provided by the FRR system need to be spectrally scaled to the natural light field of the Baltic Sea. In general, the requirement of spectral scaling in water bodies under terrestrial impact concerns all light-adaptive parameters provided by any active non-spectral bio-optical technique. The FRR system can be applied to studies of all phytoplankton that harvest light efficiently in the waveband matching the bluish FRR excitation. Although these taxa cover the great bulk of all phytoplankton taxa, one exception with pronounced ecological significance is found in the Baltic Sea. The FRR system cannot be used to monitor the photophysiology of cyanobacterial taxa harvesting light in the yellow-red waveband. These taxa include the ecologically significant bloom-forming cyanobacterial taxa of the Baltic Sea.
Abstract:
We study the problem of matching applicants to jobs under one-sided preferences: each applicant ranks a non-empty subset of jobs in order of preference, possibly involving ties. A matching M is said to be more popular than T if the applicants that prefer M to T outnumber those that prefer T to M. A matching is said to be popular if there is no matching more popular than it. Equivalently, a matching M is popular if phi(M, T) >= phi(T, M) for all matchings T, where phi(X, Y) is the number of applicants that prefer X to Y. Previously studied solution concepts based on the popularity criterion are either not guaranteed to exist for every instance (e.g., popular matchings) or are NP-hard to compute (e.g., least unpopular matchings). This paper addresses this issue by considering mixed matchings. A mixed matching is simply a probability distribution over matchings in the input graph. The function phi that compares two matchings generalizes in a natural manner to mixed matchings by taking expectations. A mixed matching P is popular if phi(P, Q) >= phi(Q, P) for all mixed matchings Q. We show that popular mixed matchings always exist, and we design polynomial-time algorithms for finding them. We then study their efficiency and give tight bounds on the price of anarchy and price of stability of the popular matching problem.
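A small sketch (toy instance; illustrative names) of how phi extends to mixed matchings by taking expectations over independent draws from the two distributions:

```python
def phi(M, T, prefs):
    """Number of applicants strictly preferring matching M to matching T."""
    def rank(a, j):
        return prefs[a].index(j) if j in prefs[a] else len(prefs[a])
    return sum(1 for a in prefs if rank(a, M.get(a)) < rank(a, T.get(a)))

def phi_mixed(P, Q, prefs):
    """phi(P, Q) = E[phi(M, T)] with M ~ P and T ~ Q, drawn independently."""
    return sum(p * q * phi(M, T, prefs) for M, p in P for T, q in Q)

# Two applicants both acceptable only to job j1: neither pure matching
# dominates the other, and the uniform mixture ties against each of them.
prefs = {"a1": ["j1"], "a2": ["j1"]}
M1, M2 = {"a1": "j1"}, {"a2": "j1"}
P = [(M1, 0.5), (M2, 0.5)]
```

On this instance the uniform mixture P satisfies phi(P, Q) >= phi(Q, P) against each pure matching, illustrating why randomisation restores guaranteed existence.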