999 results for Information leak
Abstract:
This paper presents a chance-constrained programming approach for constructing maximum-margin classifiers that are robust to interval-valued uncertainty in the training examples. The methodology employs chance constraints to ensure that uncertain examples are classified correctly with high probability. The main contribution of the paper is to pose the resulting optimization problem as a Second Order Cone Program by using large-deviation inequalities due to Bernstein. Apart from the support and mean of the uncertain examples, these Bernstein-based relaxations make no further assumptions on the underlying uncertainty. Classifiers built using the proposed approach are less conservative and yield higher margins, and hence are expected to generalize better than existing methods. Experimental results on synthetic and real-world datasets show that the proposed classifiers handle interval-valued uncertainty better than the state of the art.
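To make the construction concrete, the general shape of such a chance constraint and its cone reformulation can be sketched. Note this is the generic mean-covariance (multivariate Chebyshev) relaxation, shown only for illustration; the paper's own Bernstein bounds use only the support and mean, so the constant and norm term below are assumptions, not the paper's formula:

```latex
% Chance constraint: classify uncertain example X_i (label y_i)
% correctly with probability at least 1 - \epsilon:
\Pr\!\left( y_i \,(w^\top X_i + b) \ge 1 \right) \;\ge\; 1 - \epsilon
% A mean-covariance relaxation with mean \mu_i and covariance \Sigma_i
% turns this into a second-order cone constraint:
y_i \,(w^\top \mu_i + b) \;\ge\; 1 + \kappa \,\lVert \Sigma_i^{1/2} w \rVert_2,
\qquad \kappa = \sqrt{\tfrac{1-\epsilon}{\epsilon}}
```

Because each relaxed constraint involves a Euclidean norm of an affine function of $w$, the resulting maximum-margin problem is a Second Order Cone Program.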
Abstract:
The paper presents initial findings from the Austroads-funded project NT1782, Ability to Absorb Information through Electronic and Static Signs. The paper investigates how easily messages displayed on co-located signs can be absorbed, and whether drivers can absorb messages and take appropriate action without any adverse impact on the safety and efficiency of driving. Co-location of three types of signs under motorway conditions was investigated: direction signs (DS), variable message signs (VMS) and variable speed limits/lane control signs (VSL/LCS). The authors reviewed worldwide practice and research evidence on different types of sign co-location. It was found that dual co-location of VSL/LCS, VMS and/or DS is a practical arrangement which has been widely practised overseas and in Australia. Triple co-location of VSL/LCS, VMS and DS is also practised overseas but is still new to the Australian driving community. The NT1782 project also employed an advanced driving simulator (ADS) to further investigate the possible impacts of sign co-location on drivers' responses in an emergency situation, and no obviously adverse impacts were identified in the ADS study. The authors consolidated all findings and concluded that although there is no clear evidence showing that triple co-location gives rise to riskier behaviour, this proposition should be viewed with caution. Further evaluation of triple co-location in a real-life setting is called for.
Abstract:
Digital Image
Abstract:
The longevity of seed in the soil is a key determinant of the cost and length of weed eradication programs. Soil seed bank information and ongoing research have informed the planning and reporting of two nationally cost-shared weed eradication programs based in tropical north Queensland. These eradication programs target serious weeds such as Chromolaena odorata, Mikania micrantha, Miconia calvescens, Clidemia hirta and Limnocharis flava. Various methods are available for estimating soil seed persistence. Field methods to estimate total and germinable soil seed densities include seed packet burial trials, extracting seed from field soil samples, germinating seed in field soil samples, and observations from native range seed bank studies. Interrogating field control records can also indicate the length of the control and monitoring periods needed to exhaust the seed bank. Recently, laboratory tests which rapidly age seed have provided an additional indicator of relative seed persistence. Each method has its advantages, drawbacks and logistical constraints.
Abstract:
We carried out a discriminant analysis with identity by descent (IBD) at each marker as the inputs and the sib-pair type (affected-affected versus affected-unaffected) as the output. Using simple logistic regression for this discriminant analysis, we illustrate the importance of comparing models with different numbers of parameters. Such model comparisons are best carried out using either the Akaike information criterion (AIC) or the Bayesian information criterion (BIC). When AIC (or BIC) stepwise variable selection was applied to the German Asthma data set, a group of markers was selected which provides the best fit to the data (assuming an additive effect). Interestingly, these 25-26 markers were not identical to those with the highest (in magnitude) single-locus lod scores.
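The model-comparison step described here rests on the standard AIC and BIC formulas, which penalize log-likelihood by the number of parameters. A minimal sketch follows; the log-likelihoods, parameter counts and sample size are invented for illustration and are not the German Asthma results:

```python
import math

def aic(log_likelihood, k):
    # Akaike information criterion: AIC = 2k - 2 ln L
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    # Bayesian information criterion: BIC = k ln n - 2 ln L
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical fits of two logistic models on n = 200 sib pairs:
# a smaller model (2 markers + intercept) and a larger one (5 + intercept).
ll_small, k_small = -120.5, 3
ll_large, k_large = -118.9, 6
n = 200

# The larger model fits better (higher likelihood) but pays a penalty,
# so the criterion can still prefer the smaller model.
best = min((aic(ll_small, k_small), "small"),
           (aic(ll_large, k_large), "large"))
print(best)  # the model with the lower AIC wins
```

BIC works the same way but penalizes extra parameters more heavily for n > 7 or so, since ln n > 2; stepwise selection simply repeats this comparison while adding or dropping one marker at a time.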
Abstract:
The safety of food has become an issue of increasing interest to consumers and the media. It has also become a source of concern, as the amount of information on food safety risks continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is the attribute that consumers find very difficult to assess. The literature in this study consists of three main themes: traceability; consumer behaviour related to both quality and safety issues and the perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying the willingness to pay of consumers makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results showed that certain risk factors affect consumer willingness to pay. If the respondents considered genetic modification of food or foodborne zoonotic diseases as harmful or extremely harmful risk factors in food, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information.
The results also showed that safety-related quality cues are significant to the consumers. In the first place, the consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Similarly, other process-control related information ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
Abstract:
Information sharing in distance collaboration: A software engineering perspective, Queensland
Factors in software engineering workgroups such as geographical dispersion and background discipline can be conceptually characterized as "distances", and they are obstructive to team collaboration and information sharing. This thesis focuses on information sharing across multidimensional distances and develops an information sharing distance model with six core dimensions: geography, time zone, organization, multi-discipline, heterogeneous roles, and varying project tenure. The research suggests that the effectiveness of workgroups may be improved through mindful conduct of information sharing, especially proactive consideration of, and explicit adjustment for, the recipient's distances when sharing information.
Abstract:
Students in higher education typically learn to use information as part of their course of study, which is intended to support ongoing academic, personal and professional growth. To inform the development of effective information literacy education, this research uses a phenomenographic approach to investigate the experiences of a teacher and students engaged in lessons focused on exploring language and gender topics by tracing and analyzing their evolution through scholarly discourse. The findings suggest that the way learners use information influences content-focused learning outcomes, and they reveal how teachers may enact lessons that enable students to learn to use information in ways that foster a specific understanding of the topic they are investigating.
Abstract:
The present paper suggests articulating the general context of the workplace in information literacy research, distinguishing between information literacy research in workplaces and in professions. Referring to the results of a phenomenographic enquiry into web professionals' information literacy as an example, the paper indicates that, depending on the nature of the context, work-related information literacy can be experienced beyond physical workspaces, at the level of the profession. This involves people interacting with each other and with information at a broader level than within a physically bounded workspace. In the example case discussed in the paper, virtuality is identified as the dominant feature of the profession that causes information literacy to be experienced at a professional level. It is anticipated that pursuing the direction proposed in the paper will result in a more segmented image of work-related information literacy.
Abstract:
Based on unique news data relating to gold and crude oil, we investigate how news volume and sentiment, shocks in trading activity, market depth, and trader positions unrelated to information flow covary with realized volatility. Positive shocks to the rate of news arrival and negative shocks to news sentiment exhibit the largest effects. After controlling for the level of news flow and cross-correlations, net trader positions play only a minor role. These findings are at odds with those of [Wang (2002a). The Journal of Futures Markets, 22, 427–450; Wang (2002b). The Financial Review, 37, 295–316], but are consistent with the previous literature, which does not find a strong link between volatility and trader positions.
Abstract:
Optimizing the quality of early childhood education (ECE) is an international policy priority. Teacher-child interactions have been identified as the strongest indicator of quality and the most potent predictor of child outcomes. This paper presents an ethnomethodological and conversation analysis of an interaction between an early childhood educator and two children as they engage with each other while performing a Web search. Analysis shows that question design can elicit qualitatively different responses with regard to sustained interactions. Understanding the design of teacher questions has pedagogic implications for the work of the teacher and for the broader quality agenda in early childhood education.
Abstract:
Ecosystem-based management requires the integration of various types of assessment indicators. Understanding stakeholders' information preferences is important in selecting those indicators that best support management and policy. In democratic participatory management institutions, the preferences of both decision-makers and the general public may matter. This paper presents a multi-criteria analysis aimed at quantifying the relative importance to these groups of the economic, ecological and socio-economic indicators usually considered when managing ecosystem services in a coastal development context. The Analytic Hierarchy Process (AHP) is applied within two nationwide surveys in Australia, and the preferences of both the general public and decision-makers for these indicators are elicited and compared. Results show that, on average across both groups, the priority in assessing a generic coastal development project is the ecological assessment of its impacts on marine biodiversity. Ecological assessment indicators are preferred overall to both economic and socio-economic indicators, regardless of the nature of the impacts studied. This pattern is observed for a significantly larger proportion of decision-maker respondents than general-public respondents, questioning the extent to which the general public's preferences are well reflected in decision-making processes.
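The AHP computation behind such a survey can be sketched briefly. The 3x3 pairwise comparison matrix below is a hypothetical example on Saaty's 1-9 scale (not the surveys' elicited judgments), and the priorities are approximated with the geometric-mean method rather than the exact principal eigenvector:

```python
import math

# Hypothetical pairwise comparisons among three indicator groups:
# rows/columns = (ecological, economic, socio-economic).
# A[i][j] > 1 means criterion i is preferred to criterion j.
A = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]

def ahp_priorities(matrix):
    # Geometric mean of each row, normalized to sum to 1,
    # approximates the principal eigenvector of the matrix.
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

w = ahp_priorities(A)
print(w)  # ecological weight dominates for this example matrix
```

With these illustrative judgments the ecological indicator receives the largest weight, mirroring the paper's finding that ecological assessment is prioritized over economic and socio-economic indicators.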
Abstract:
The purpose of this study is to describe the development of applications of mass spectrometry for the structural analysis of non-coding ribonucleic acids during the past decade. Mass spectrometric methods are compared with traditional gel electrophoretic methods, the performance characteristics of mass spectrometric analyses are studied, and future trends in the mass spectrometry of ribonucleic acids are discussed. Non-coding ribonucleic acids are short polymeric biomolecules which are not translated to proteins, but which may affect gene expression in all organisms. Regulatory ribonucleic acids act through transient interactions with key molecules in signal transduction pathways. Interactions are mediated through specific secondary and tertiary structures. Posttranscriptional modifications in the structures of molecules may introduce new properties to the organism, such as adaptation to environmental changes or development of resistance to antibiotics. In the scope of this study, the structural studies include i) determination of the sequence of nucleobases in the polymer chain, ii) characterisation and localisation of posttranscriptional modifications in nucleobases and in the backbone structure, iii) identification of ribonucleic acid-binding molecules and iv) probing of higher order structures in the ribonucleic acid molecule. Bacteria, archaea, viruses and HeLa cancer cells have been used as target organisms. Synthesised ribonucleic acids consisting of structural regions of interest have been frequently used. Electrospray ionisation (ESI) and matrix-assisted laser desorption ionisation (MALDI) have been used for ionisation of ribonucleic acid analytes. Ammonium acetate and 2-propanol are common solvents for ESI. Trihydroxyacetophenone is the optimal MALDI matrix for ionisation of ribonucleic acids and peptides. Ammonium salts are used in ESI buffers and MALDI matrices as additives to remove cation adducts.
Reverse phase high performance liquid chromatography has been used for desalting and fractionation of analytes either off-line or on-line, coupled with an ESI source. Triethylamine and triethylammonium bicarbonate are used as ion pair reagents almost exclusively. A Fourier transform ion cyclotron resonance analyser using ESI coupled with liquid chromatography is the platform of choice for all forms of structural analyses. A time-of-flight (TOF) analyser using MALDI may offer a sensitive, easy-to-use and economical solution for simple sequencing of longer oligonucleotides and for analyses of analyte mixtures without prior fractionation. Special analysis software is used for computer-aided interpretation of mass spectra. With mass spectrometry, sequences of 20-30 nucleotides in length may be determined unambiguously. Sequencing may be applied to quality control of short synthetic oligomers for analytical purposes. Sequencing in conjunction with other structural studies enables accurate localisation and characterisation of posttranscriptional modifications and identification of nucleobases and amino acids at the sites of interaction. High throughput screening methods for RNA-binding ligands have been developed. Probing of the higher order structures has provided supportive data for computer-generated three-dimensional models of viral pseudoknots. In conclusion, mass spectrometric methods are well suited for structural analyses of small species of ribonucleic acids, such as short non-coding ribonucleic acids in the molecular size region of 20-30 nucleotides. Structural information not attainable with other methods of analysis, such as nuclear magnetic resonance and X-ray crystallography, may be obtained with the use of mass spectrometry. Ligand screening may be used in the search for possible new therapeutic agents.
Demanding assay design and challenging data interpretation require multidisciplinary knowledge. The application of mass spectrometry to structural studies of ribonucleic acids is probably most efficiently conducted in specialist groups consisting of researchers from various fields of science.