950 results for Search Strategies


Relevance:

20.00%

Publisher:

Abstract:

This report provides a systematic review of the most economically damaging endemic diseases and conditions for the Australian red meat industry (cattle, sheep and goats). Diseases of cattle, sheep and goats were identified and prioritised according to their prevalence, distribution, risk factors and mitigation. The economic cost of each disease as a result of production losses, preventive costs and treatment costs is estimated at the herd and flock level, then extrapolated to a national basis using herd/flock demographics from the 2010-11 Agricultural Census by the Australian Bureau of Statistics. Information shortfalls and recommendations for further research are also specified. A total of 17 cattle, 23 sheep and nine goat diseases were prioritised based on feedback received from producer, government and industry surveys, followed by discussions between the consultants and MLA. Assumptions of disease distribution, in-herd/flock prevalence, impacts on mortality/production and costs for prevention and treatment were obtained from the literature where available. Where these data were not available, the consultants used their own expertise to estimate the relevant measures for each disease. Levels of confidence in the assumptions for each disease were estimated, and gaps in knowledge identified. The assumptions were analysed using a specialised Excel model that estimated the per animal, herd/flock and national costs of each important disease. The report was peer reviewed and workshopped by the consultants and experts selected by MLA before being finalised. Consequently, this report is an important resource that will guide and prioritise future research, development and extension activities by a variety of stakeholders in the red meat industry. This report completes Phase I and Phase II of an overall four-phase project initiative by MLA, with identified data gaps in this report potentially being addressed within the later phases.
Modelling the economic costs using a consistent approach for each disease ensures that the derived estimates are transparent and can be refined if improved data on prevalence becomes available. This means that the report will be an enduring resource for developing policies and strategies for the management of endemic diseases within the Australian red meat industry.
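As a hedged illustration of the per-animal, herd/flock and national cost roll-up described above (the function names and all figures below are invented placeholders, not values from the report):

```python
# Sketch of a per-animal -> herd -> national cost roll-up for one disease.
# All numbers are illustrative, not taken from the MLA report.

def disease_cost_per_animal(prevalence, production_loss, prevention, treatment):
    """Expected annual cost per animal: production losses and treatment are
    incurred only by affected animals; prevention is applied herd-wide."""
    return prevalence * (production_loss + treatment) + prevention

def national_cost(per_animal_cost, avg_herd_size, n_herds, distribution):
    """Scale the per-animal cost to a national estimate, where `distribution`
    is the fraction of herds in regions where the disease occurs."""
    return per_animal_cost * avg_herd_size * n_herds * distribution

per_head = disease_cost_per_animal(prevalence=0.10, production_loss=25.0,
                                   prevention=1.50, treatment=4.0)
print(round(per_head, 2))   # 0.10 * (25.0 + 4.0) + 1.50 = 4.4 dollars/head
print(national_cost(per_head, avg_herd_size=800, n_herds=50_000,
                    distribution=0.6))
```

Because each disease runs through the same roll-up, an improved prevalence estimate can simply be substituted and the downstream figures recomputed, which is the transparency the report aims for.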

Relevance:

20.00%

Publisher:

Abstract:

This project proposes to implement resistance gene pyramiding strategies through close collaboration with Pacific Seeds. These strategies have been developed by Department of Primary Industries and Fisheries (DPI&F) researchers in two previous GRDC projects, DAQ356 and DAQ537. The gene pyramids will be incorporated into elite breeding material using techniques and technologies developed by DPI&F. These include the use of DNA markers. If successful, a range of elite lines/commercial hybrids containing strategic resistance gene pyramids will be available to growers. These lines will provide the industry with a directed strategy to manage the sunflower rust pathogen and reduce the risk of outbreaks of the disease.

Relevance:

20.00%

Publisher:

Abstract:

Low-level strategic supplements constitute one of the few options for northern beef producers to increase breeder productivity and profitability. Objectives of the project were to improve the cost-effectiveness of using such supplements and to improve supplement delivery systems. Urea-based supplements fed during the dry season can substantially reduce breeder liveweight loss and increase fertility during severe dry seasons. When fed during the late wet season, these supplements also increased breeder liveweight and the fertility of breeders in low body condition. Intake of dry lick supplements fed free choice is apparently determined primarily by the palatability of supplements relative to pasture, and training of cattle appears to be of limited importance. Siting of supplementation points has some effect on supplement intake, but little effect on grazing behaviour. Economic analysis of supplementation (urea, phosphorus or molasses) and weaning strategies was based on the relative efficacy of these strategies to maintain breeder body condition late in the dry season. Adequate body condition of breeders at this time of the year is needed to avoid mortality from under-nutrition and achieve satisfactory fertility of breeders during the following wet season. Supplements were highly cost-effective when they reduced mortality, but economic returns were generally low if the only benefit was increased fertility.

Relevance:

20.00%

Publisher:

Abstract:

Progress in crop improvement is limited by the ability to identify favourable combinations of genotypes (G) and management practices (M) in relevant target environments (E) given the resources available to search among the myriad of possible combinations. To underpin yield advance we require prediction of phenotype based on genotype. In plant breeding, traditional phenotypic selection methods have involved measuring phenotypic performance of large segregating populations in multi-environment trials and applying rigorous statistical procedures based on quantitative genetic theory to identify superior individuals. Recent developments in the ability to inexpensively and densely map/sequence genomes have facilitated a shift from the level of the individual (genotype) to the level of the genomic region. Molecular breeding strategies using genome wide prediction and genomic selection approaches have developed rapidly. However, their applicability to complex traits remains constrained by gene-gene and gene-environment interactions, which restrict the predictive power of associations of genomic regions with phenotypic responses. Here it is argued that crop ecophysiology and functional whole plant modelling can provide an effective link between molecular and organism scales and enhance molecular breeding by adding value to genetic prediction approaches. A physiological framework that facilitates dissection and modelling of complex traits can inform phenotyping methods for marker/gene detection and underpin prediction of likely phenotypic consequences of trait and genetic variation in target environments. This approach holds considerable promise for more effectively linking genotype to phenotype for complex adaptive traits. Specific examples focused on drought adaptation are presented to highlight the concepts.

Relevance:

20.00%

Publisher:

Abstract:

A method that yields optical Barker codes of smallest known lengths for given discrimination is described.

Relevance:

20.00%

Publisher:

Abstract:

The following problem is considered. Given the locations of the Central Processing Unit (CPU) and the terminals which have to communicate with it, determine the number and locations of the concentrators and assign the terminals to the concentrators in such a way that the total cost is minimized. There is also a fixed cost associated with each concentrator, and there is an upper limit to the number of terminals which can be connected to a concentrator. The terminals can also be connected directly to the CPU. In this paper it is assumed that the concentrators can be located anywhere in the area A containing the CPU and the terminals. This then becomes a multimodal optimization problem. In the proposed algorithm a stochastic automaton is used as a search device to locate the minimum of the multimodal cost function. The proposed algorithm involves the following. The area A containing the CPU and the terminals is divided into an arbitrary number of regions (say K). An approximate value for the number of concentrators is assumed (say m); the optimum number is determined later by iteration. The m concentrators can be assigned to the K regions in K^m ways (all possible assignments are feasible, i.e. a region can contain 0, 1, …, m concentrators). Each possible assignment is taken to represent a state of the stochastic variable-structure automaton. To start with, all the states are assigned equal probabilities. At each stage of the search the automaton visits a state according to the current probability distribution. At each visit the automaton selects a 'point' inside that state with uniform probability. The cost associated with that point is calculated and the average cost of that state is updated. Then the probabilities of all the states are updated.
The probabilities are taken to be inversely proportional to the average costs of the states. After a certain number of searches the search probabilities become stationary and the automaton visits a particular state again and again; the automaton is then said to have converged to that state. The exact locations of the concentrators are then determined by conducting a local gradient search within that state. This algorithm was applied to a set of test problems and the results were compared with those given by Cooper's (1964, 1967) EAC algorithm; on average, the proposed algorithm was found to perform better.
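A toy sketch of the automaton-guided search loop described above, reduced to one dimension for brevity (the real problem places m concentrators in a 2-D area; the interval "states" and the cost function below are illustrative, and the final local gradient search within the converged state is omitted):

```python
import random

def automaton_search(cost, intervals, n_steps=2000, seed=1):
    """Each interval is a 'state'. The automaton visits a state with
    probability inversely proportional to its running average cost, samples
    a point uniformly inside it, and updates that state's average."""
    random.seed(seed)
    avg = [0.0] * len(intervals)   # running average cost per state
    cnt = [0] * len(intervals)     # visits per state
    for _ in range(n_steps):
        # unvisited states keep weight 1.0 so every state gets explored
        w = [1.0 / (a + 1e-9) if c else 1.0 for a, c in zip(avg, cnt)]
        s = random.choices(range(len(intervals)), weights=w)[0]
        lo, hi = intervals[s]
        x = random.uniform(lo, hi)        # uniform point inside the state
        c = cost(x)
        cnt[s] += 1
        avg[s] += (c - avg[s]) / cnt[s]   # incremental mean update
    # the state the automaton has converged towards: lowest average cost
    return min(range(len(intervals)),
               key=lambda s: avg[s] if cnt[s] else float("inf"))

# multimodal cost with its global minimum near x = 7
f = lambda x: (x - 7) ** 2 * (1 + 0.3 * abs(x % 2 - 1))
best = automaton_search(f, [(0, 2), (2, 4), (4, 6), (6, 8), (8, 10)])
print(best)  # expected: index 3, the interval containing x = 7
```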

Relevance:

20.00%

Publisher:

Abstract:

Exposure to hot environments affects milk yield (MY) and milk composition of pasture and feed-pad fed dairy cows in subtropical regions. This study was undertaken during summer to compare MY and physiology of cows exposed to six heat-load management treatments. Seventy-eight Holstein-Friesian cows were blocked by season of calving, parity, milk yield, BW, and milk protein (%) and milk fat (%) measured in 2 weeks prior to the start of the study. Within blocks, cows were randomly allocated to one of the following treatments: open-sided iron roofed day pen adjacent to dairy (CID) + sprinklers (SP); CID only; non-shaded pen adjacent to dairy + SP (NSD + SP); open-sided shade cloth roofed day pen adjacent to dairy (SCD); NSD + sprinkler (sprinkler on for 45 min at 1100 h if mean respiration rate >80 breaths per minute (NSD + WSP)); open-sided shade cloth roofed structure over feed bunk in paddock + 1 km walk to and from the dairy (SCP + WLK). Sprinklers for CID + SP and NSD + SP cycled 2 min on, 12 min off when ambient temperature >26°C. The highest milk yields were in the CID + SP and CID treatments (23.9 L cow−1 day−1), intermediate for NSD + SP, SCD and SCP + WLK (22.4 L cow−1 day−1), and lowest for NSD + WSP (21.3 L cow−1 day−1) (P < 0.05). The highest (P < 0.05) feed intakes occurred in the CID + SP and CID treatments while intake was lowest (P < 0.05) for NSD + WSP and SCP + WLK. Weather data were collected on site at 10-min intervals, and from these, THI was calculated. Nonlinear regression modelling of MY × THI and heat-load management treatment demonstrated that cows in CID + SP showed no decline in MY out to a THI break point value of 83.2, whereas the pooled MY of the other treatments declined when THI >80.7. A combination of iron roof shade plus water sprinkling throughout the day provided the most effective control of heat load.
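The "no decline in MY up to a THI break point, then a decline" shape described above is a broken-stick model. A hedged sketch of fitting such a model by a simple grid search over candidate break points (the data series and fitting method below are invented for illustration, not the study's measurements or model):

```python
# Broken-stick fit: milk yield is flat up to a THI break point, then
# falls linearly. Synthetic data; not values from the experiment.

def fit_breakpoint(thi, my, candidates):
    """Grid-search the break point; for each candidate, fit the plateau
    level and the post-break slope by least squares."""
    best = None
    for bp in candidates:
        flat = [y for t, y in zip(thi, my) if t <= bp]
        plateau = sum(flat) / len(flat)
        # slope of the decline, forced through (bp, plateau)
        num = sum((t - bp) * (y - plateau) for t, y in zip(thi, my) if t > bp)
        den = sum((t - bp) ** 2 for t in thi if t > bp) or 1.0
        slope = num / den
        def pred(t):
            return plateau if t <= bp else plateau + slope * (t - bp)
        sse = sum((y - pred(t)) ** 2 for t, y in zip(thi, my))
        if best is None or sse < best[0]:
            best = (sse, bp, plateau, slope)
    return best[1:]

thi = [70, 72, 74, 76, 78, 80, 82, 84, 86]
my = [23.9, 23.9, 23.8, 24.0, 23.9, 23.9, 23.0, 22.1, 21.2]  # synthetic
bp, plateau, slope = fit_breakpoint(thi, my, candidates=range(72, 85))
print(bp)  # break point at THI 80 for this synthetic series
```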

Relevance:

20.00%

Publisher:

Abstract:

Current smartphones have a storage capacity of several gigabytes. More and more information is stored on mobile devices. To meet the challenge of information organization, we turn to desktop search. Users often possess multiple devices, and synchronize (subsets of) information between them. This makes file synchronization more important. This thesis presents Dessy, a desktop search and synchronization framework for mobile devices. Dessy uses desktop search techniques, such as indexing, query and index term stemming, and search relevance ranking. Dessy finds files by their content, metadata, and context information. For example, PDF files may be found by their author, subject, title, or text. EXIF data of JPEG files may be used in finding them. User-defined tags can be added to files to organize and retrieve them later. Retrieved files are ranked according to their relevance to the search query. The Dessy prototype uses the BM25 ranking function, used widely in information retrieval. Dessy provides an interface for locating files for both users and applications. Dessy is closely integrated with the Syxaw file synchronizer, which provides efficient file and metadata synchronization, optimizing network usage. Dessy supports synchronization of search results, individual files, and directory trees. It allows finding and synchronizing files that reside on remote computers, or the Internet. Dessy is designed to solve the problem of efficient mobile desktop search and synchronization, also supporting remote and Internet search. Remote searches may be carried out offline using a downloaded index, or while connected to the remote machine on a weak network. To secure user data, transmissions between the Dessy client and server are encrypted using symmetric encryption; the symmetric keys are exchanged using RSA. Dessy emphasizes extensibility; the cryptography layer can also be extended. Users may tag their files with context tags and control custom file metadata.
Adding new indexed file types, metadata fields, ranking methods, and index types is easy. Finding files is done with virtual directories, which are views into the user’s files, browseable by regular file managers. On mobile devices, the Dessy GUI provides easy access to the search and synchronization system. This thesis includes results of Dessy synchronization and search experiments, including power usage measurements. Finally, Dessy has been designed with mobility and device constraints in mind. It requires only MIDP 2.0 Mobile Java with FileConnection support, and Java 1.5 on desktop machines.
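The abstract names BM25 as the prototype's ranking function. A minimal, self-contained sketch of BM25 scoring (the toy corpus and the common default parameters k1 = 1.2, b = 0.75 are assumptions of this sketch, not details from the thesis):

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.2, b=0.75):
    """Score one tokenized document `doc` against `query_terms`,
    given the whole tokenized `corpus` for IDF and length statistics."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N          # average doc length
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)     # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)
        tf = doc.count(term)                         # term frequency
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [["mobile", "search", "index"],
          ["file", "sync", "mobile"],
          ["desktop", "search", "ranking", "search"]]
scores = [bm25_score(["search"], d, corpus) for d in corpus]
print(scores.index(max(scores)))  # doc 2 ranks highest: "search" twice
```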

Relevance:

20.00%

Publisher:

Abstract:

- Background Palliative medicine and other specialists play significant legal roles in decisions to withhold and withdraw life-sustaining treatment at the end of life. Yet little is known about their knowledge of or attitudes to the law, and the role they think it should play in medical practice. Consideration of doctors’ views is critical to optimizing patient outcomes at the end of life. However, doctors are difficult to engage as participants in empirical research, presenting challenges for researchers seeking to understand doctors’ experiences and perspectives. - Aims To determine how to engage doctors involved in end-of-life care in empirical research about knowledge of the law and the role it plays in medical practice at the end of life. - Methods Postal survey of all specialists in palliative medicine, emergency medicine, geriatric medicine, intensive care, medical oncology, renal medicine, and respiratory medicine in three Australian states: New South Wales, Victoria, and Queensland. The survey was sent in hard copy with two reminders and a follow up reminder letter was also sent to the directors of hospital emergency departments. Awareness was further promoted through engagement with the relevant medical colleges and publications in professional journals; various incentives to respond were also used. The key measure is the response rate of doctors to the survey. - Results Thirty-two percent of doctors in the main study completed their survey with response rate by specialty ranging from 52% (palliative care) to 24% (medical oncology). This overall response rate was twice that of the reweighted pilot study (16%). - Conclusions Doctors remain a difficult cohort to engage in survey research but strategic recruitment efforts can be effective in increasing response rate. Collaboration with doctors and their professional bodies in both the development of the survey instrument and recruitment of participants is essential.

Relevance:

20.00%

Publisher:

Abstract:

We explore how a standardization effort (i.e., when a firm pursues standards to further innovation) involves different search processes for knowledge and innovation outcomes. Using an inductive case study of Vanke, a leading Chinese property developer, we show how varying degrees of knowledge complexity and codification combine to produce a typology of four types of search process: active, integrative, decentralized and passive, resulting in four types of innovation outcome: modular, radical, incremental and architectural. We argue that when the standardization effort in a firm involves highly codified knowledge, incremental and architectural innovation outcomes are fostered, while modular and radical innovations are hindered. We discuss how standardization efforts can result in a second-order innovation capability, and conclude by calling for comparative research in other settings to understand how standardization efforts can be suited to different types of search process in different industry contexts.

Relevance:

20.00%

Publisher:

Abstract:

Austria and Finland are persistently referred to as the “success stories” of post-1945 European history. Notwithstanding their different points of departure, in the course of the Cold War both countries portrayed themselves as small and neutral border-states in the world dictated by superpower politics. By the 1970s, both countries frequently ranked at the top end in various international classifications regarding economic development and well-being in society. This trend continues today. The study takes under scrutiny the concept of consensus which figures centrally in the two national narratives of post-1945 success. Given that the two domestic contexts as such only share few direct links with one another and are more obviously different than similar in terms of their geographical location, historical experiences and politico-cultural traditions, the analogies and variations in the anatomies of the post-1945 “cultures of consensus” provide an interesting topic for a historical comparative and cross-national examination. The main research question concerns the identification and analysis of the conceptual and procedural convergence points of the concepts of the state and consensus. The thesis is divided into six main chapters. After the introduction, the second chapter presents the theoretical framework in more detail by focusing on the key concepts of the study – the state and consensus. Chapter two also introduces the comparative historical and cross-national research angles. Chapter three grounds the key concepts of the state and consensus in the historical contexts of Austria and Finland by discussing the state, the nation and democracy in a longer term comparative perspective. The fourth and fifth chapter present case studies on the two policy fields, the “pillars”, upon which the post-1945 Austrian and Finnish cultures of consensus are argued to have rested. 
Chapter four deals with neo-corporatist features in the economic policy making and chapter five discusses the building up of domestic consensus regarding the key concepts of neutrality policies in the 1950s and 1960s. The study concludes that it was not consensus as such but the strikingly intense preoccupation with the theme of domestic consensus that cross-cut, in a curiously analogous manner, the policy-making processes studied. The main challenge for the post-1945 architects of Austrian and Finnish cultures of consensus was to find strategies and concepts for consensus-building which would be compatible with the principles of democracy. Discussed at the level of procedures, the most important finding of the study concerns the triangular mechanism of coordination, consultation and cooperation that set into motion and facilitated a new type of search for consensus in both post-war societies. In this triangle, the agency of the state was central, though in varying ways. The new conceptions concerning a small state’s position in the Cold War world also prompted cross-nationally perceivable willingness to reconsider inherited concepts and procedures of the state and the nation. At the same time, the ways of understanding the role of the state and its relation to society remained profoundly different in Austria and Finland and this basic difference was in many ways reflected in the concepts and procedures deployed in the search for consensus and management of domestic conflicts.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue, while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and some special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity. Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page with results, containing the links most relevant to the query and also sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. Against every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on a price to be charged to each advertiser. Due to increasing demands for Internet advertising space, most search engines currently use auction mechanisms for this purpose. These are called sponsored search auctions.
A significant percentage of the revenue of Internet giants such as Google, Yahoo!, MSN, etc., comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored auction context, and pursue the objective of designing a mechanism that is superior to these two mechanisms. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction since they are assured of gaining a non-negative payoff by doing so.
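A minimal sketch of the GSP rule discussed above, assuming its simplest textbook form in which advertisers are ranked by bid and each winner pays the next-highest bid per click (the bids and slot count below are invented; real deployments typically weight bids by click-through rates):

```python
# Simplest-form Generalized Second Price (GSP) allocation and pricing.
# Bids are per-click; ties and quality weights are ignored for brevity.

def gsp_allocate(bids, n_slots):
    """bids: dict advertiser -> bid. Returns [(advertiser, price)] per slot,
    best slot first; each winner pays the bid ranked just below it."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    outcome = []
    for i in range(min(n_slots, len(ranked))):
        winner, _ = ranked[i]
        # pay the next-highest bid, or 0 if no bidder remains below
        price = ranked[i + 1][1] if i + 1 < len(ranked) else 0.0
        outcome.append((winner, price))
    return outcome

print(gsp_allocate({"a": 3.0, "b": 1.0, "c": 2.5}, n_slots=2))
# [('a', 2.5), ('c', 1.0)]
```

Under this rule a winner's payment never depends on its own bid, yet unlike VCG, truthful bidding is not a dominant strategy, which is part of what motivates the paper's comparison.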

Relevance:

20.00%

Publisher:

Abstract:

For many, particularly in the Anglophone world and Western Europe, it may be obvious that Google has a monopoly over online search and advertising and that this is an undesirable state of affairs, due to Google's ability to mediate information flows online. The baffling question may be why governments and regulators are doing little to nothing about this situation, given the increasingly pivotal importance of the internet and free flowing communications in our lives. However, the law concerning monopolies, namely antitrust or competition law, works in what may be seen as a less intuitive way by the general public. Monopolies themselves are not illegal. Conduct that is unlawful, i.e. abuses of that market power, is defined by a complex set of rules and revolves principally around economic harm suffered due to anticompetitive behavior. However the effect of information monopolies over search, such as Google’s, is more than just economic, yet competition law does not address this. Furthermore, Google’s collection and analysis of user data and its portfolio of related services make it difficult for others to compete. Such a situation may also explain why Google’s established search rivals, Bing and Yahoo, have not managed to provide services that are as effective or popular as Google’s own (on this issue see also the texts by Dirk Lewandowski and Astrid Mager in this reader). Users, however, are not entirely powerless. Google's business model rests, at least partially, on them – especially the data collected about them. If they stop using Google, then Google is nothing.