194 results for sicurezza, exploit, XSS, Beef, browser
Abstract:
The literature to date shows that children from poorer households tend to have worse health than their peers, and that the gap between them grows with age. We investigate whether and how health shocks (as measured by the onset of chronic conditions) contribute to the income–child health gradient, and whether the contemporaneous or cumulative effects of income play important mitigating roles. We exploit a rich panel dataset, the Longitudinal Study of Australian Children, comprising three waves. Given the availability of three waves of data, we are able to apply a range of econometric techniques (e.g. fixed and random effects) to control for unobserved heterogeneity. The paper makes several contributions to the extant literature. First, it shows that an apparent income gradient becomes relatively attenuated in our dataset when the cumulative and contemporaneous effects of household income are distinguished econometrically. Second, it demonstrates that the income–child health gradient becomes statistically insignificant when controlling for parental health and health-related behaviours or unobserved heterogeneity.
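The role of fixed effects in controlling for unobserved heterogeneity can be sketched with synthetic data (a minimal illustration with invented numbers, not the study's model or data):

```python
import numpy as np

# Synthetic panel: a fixed-effects "within" estimator removes
# child-specific unobserved heterogeneity that biases pooled OLS
# whenever it correlates with household income.
rng = np.random.default_rng(42)
n_children, n_waves, true_beta = 500, 3, 0.5
alpha = rng.normal(size=n_children)              # unobserved child effect
income = 0.8 * alpha[:, None] + rng.normal(size=(n_children, n_waves))
health = true_beta * income + alpha[:, None] + 0.5 * rng.normal(size=(n_children, n_waves))

# Pooled OLS slope: biased upward because income correlates with alpha
x, y = income.ravel(), health.ravel()
b_pooled = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# Fixed effects: demean each child's observations across the waves,
# which differences out alpha before estimating the slope
xd = (income - income.mean(axis=1, keepdims=True)).ravel()
yd = (health - health.mean(axis=1, keepdims=True)).ravel()
b_fe = float(xd @ yd) / float(xd @ xd)           # close to true_beta
```

The pooled estimate absorbs the correlation between income and the child effect, while the within transformation recovers the causal slope.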
Abstract:
So far, low-probability differentials for the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that for a cipher with a k-bit key it suffices that the upper bound on the probability be 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and that the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, together with the Feistel structure of the cipher, allows the exploitation of particularly chosen differentials with a probability as low as 2^-128. CLEFIA-128 has 2^14 such differentials, which translate into 2^14 pairs of weak keys. The probability of each differential is too low on its own, but the weak keys have a special structure which, with a divide-and-conquer approach, allows an advantage of 2^7 over generic analysis. We exploit this advantage to give a membership test for the weak-key class and provide an analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.
Abstract:
Water-to-air methane emissions from freshwater reservoirs can be dominated by sediment bubbling (ebullitive) events. Previous work to quantify methane bubbling from a number of Australian sub-tropical reservoirs has shown that this can contribute as much as 95% of total emissions. These bubbling events are controlled by a variety of different factors including water depth, surface and internal waves, wind seiching, atmospheric pressure changes and water level changes. Key to quantifying the magnitude of this emission pathway is estimating both the bubbling rate and the areal extent of bubbling. Both are seldom constant and require persistent monitoring over extended time periods before true estimates can be generated. In this paper we present a novel system for persistent monitoring of both bubbling rate and areal extent using multiple robotic surface chambers and adaptive sampling (grazing) algorithms to automate the quantification process. Individual chambers are self-propelled and guided and communicate with each other without the need for supervised control. They can maintain station at a sampling site for a desired incubation period and continuously monitor, record and report fluxes during the incubation. To exploit the methane sensor detection capabilities, the chamber can be automatically lowered to decrease the head-space and increase concentration. The grazing algorithms assign a hierarchical order to chambers within a preselected zone. Chambers then converge on the individual recording the highest 15-minute bubbling rate. Individuals maintain a specified distance from each other during each sampling period before all individuals are then required to move to different locations based on a sampling algorithm (systematic or adaptive) exploiting prior measurements. This system has been field-tested on a large-scale subtropical reservoir, Little Nerang Dam, over monthly timescales.
Using this technique, localised bubbling zones on the water storage were found to produce over 50,000 mg m^-2 d^-1, and the areal extent ranged from 1.8 to 7% of the total reservoir area. The drivers behind these changes, as well as lessons learnt from the system implementation, are presented. This system exploits relatively cheap materials, sensing and computing, and can be applied to a wide variety of aquatic and terrestrial systems.
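The convergence rule described above, in which chambers move toward the individual reporting the highest bubbling rate while keeping a set spacing, can be sketched as follows (a toy version with hypothetical step and spacing parameters, not the deployed control code):

```python
import numpy as np

def graze_step(positions, rates, min_sep=10.0, step=5.0):
    """One update of a simplified grazing rule (illustrative only; the
    deployed hierarchy, incubation and sampling logic are richer):
    each chamber moves `step` metres toward the chamber reporting the
    highest 15-minute bubbling rate, then any pair closer than
    `min_sep` is pushed apart to keep the required spacing."""
    leader = int(np.argmax(rates))
    new = positions.astype(float).copy()
    for i in range(len(positions)):
        if i == leader:
            continue
        d = positions[leader] - positions[i]
        dist = np.linalg.norm(d)
        if dist > min_sep:
            new[i] = positions[i] + step * d / dist   # converge on leader
    for i in range(len(new)):                          # enforce spacing
        for j in range(i + 1, len(new)):
            d = new[j] - new[i]
            dist = np.linalg.norm(d)
            if 0 < dist < min_sep:
                push = 0.5 * (min_sep - dist) * d / dist
                new[i] -= push
                new[j] += push
    return new
```

Each call represents one sampling period; re-running it with fresh rate measurements makes the fleet track the moving bubbling hotspot.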
Abstract:
We aim to design strategies for sequential decision making that adjust to the difficulty of the learning problem. We study this question both in the setting of prediction with expert advice, and for more general combinatorial decision tasks. We are not satisfied with just guaranteeing minimax regret rates, but we want our algorithms to perform significantly better on easy data. Two popular ways to formalize such adaptivity are second-order regret bounds and quantile bounds. The underlying notions of 'easy data', which may be paraphrased as "the learning problem has small variance" and "multiple decisions are useful", are synergistic. But even though there are sophisticated algorithms that exploit one of the two, no existing algorithm is able to adapt to both. In this paper we outline a new method for obtaining such adaptive algorithms, based on a potential function that aggregates a range of learning rates (which are essential tuning parameters). By choosing the right prior we construct efficient algorithms and show that they reap both benefits by proving the first bounds that are both second-order and incorporate quantiles.
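The idea of aggregating over a grid of learning rates rather than tuning a single one can be sketched in the experts setting (a simplified exponential-weights variant; the paper's actual potential function and prior are not reproduced here):

```python
import numpy as np

def aggregated_hedge(loss_matrix, etas):
    """Exponential-weights prediction with expert advice that keeps a
    uniform prior over a grid of learning rates (etas) instead of
    committing to one. Playing the marginal over experts lets the
    best (eta, expert) pair dominate without knowing eta in advance."""
    n_rounds, n_experts = loss_matrix.shape
    log_w = np.zeros((len(etas), n_experts))   # uniform prior over (eta, expert)
    total_loss = 0.0
    for t in range(n_rounds):
        w = np.exp(log_w - log_w.max())        # stabilised weights
        p = w.sum(axis=0)
        p /= p.sum()                           # marginal play over experts
        total_loss += float(p @ loss_matrix[t])
        log_w -= etas[:, None] * loss_matrix[t][None, :]
    return total_loss, p
```

On data where one expert is clearly best, the play concentrates on it regardless of which learning rate in the grid turns out to be right.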
Abstract:
While organizations strive to leverage the vast information generated daily from social media platforms, and decision makers are keen to identify and exploit its value, the quality of this information remains uncertain. Past research on information quality criteria and evaluation issues in social media is largely disparate, incomparable and lacking any common theoretical basis. To address this gap, this study adapts existing guidelines and exemplars of construct conceptualization in information systems research to deductively define information quality and related criteria in the social media context. Building on a notion of information derived from semiotic theory, this paper suggests a general conceptualization of information quality in the social media context that can be used in future research to develop more context-specific conceptual models.
Abstract:
Existing techniques for automated discovery of process models from event logs generally produce flat process models. Thus, they fail to exploit the notion of subprocess as well as error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of hierarchical BPMN models containing interrupting and non-interrupting boundary events and activity markers. The technique employs functional and inclusion dependency discovery techniques in order to elicit a process-subprocess hierarchy from the event log. Given this hierarchy and the projected logs associated with each node in the hierarchy, parent process and subprocess models are then discovered using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. By employing approximate dependency discovery techniques, it is possible to filter out noise in the event log arising, for example, from data entry errors or missing events. A validation with one synthetic and two real-life logs shows that process models derived by the proposed technique are more accurate and less complex than those derived with flat process discovery techniques. Meanwhile, a validation on a family of synthetically generated logs shows that the technique is resilient to varying levels of noise.
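The dependency-discovery step rests on checking whether one log attribute functionally determines another. A generic sketch of that building block follows (not the paper's algorithm, which also handles approximate dependencies to tolerate noise):

```python
def functional_dependency_holds(rows, lhs, rhs):
    """Check whether attribute `lhs` functionally determines `rhs`
    across the event-log rows: every observed value of lhs must map
    to exactly one value of rhs. Exact check only; an approximate
    variant would allow a bounded fraction of violations."""
    mapping = {}
    for row in rows:
        key, value = row[lhs], row[rhs]
        if key in mapping and mapping[key] != value:
            return False        # two rhs values for the same lhs value
        mapping[key] = value
    return True
```

In the hierarchy-elicitation setting, such checks help decide whether an attribute (e.g. a hypothetical "subprocess" column) is determined by the activity and can anchor a process-subprocess split.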
Abstract:
We exploit a voting reform in France to estimate the causal effect of exit poll information on turnout and bandwagon voting. Before the change in legislation, individuals in some French overseas territories voted after the election result had already been made public via exit poll information from mainland France. We estimate that knowing the exit poll information decreases voter turnout by about 11 percentage points. Our study provides the first clean empirical design outside the laboratory to demonstrate the effect of such knowledge on voter turnout. Furthermore, we find that exit poll information significantly increases bandwagon voting; that is, voters who choose to turn out are more likely to vote for the expected winner.
Abstract:
Meat/meat alternatives (M/MA) are key sources of Fe, Zn and protein, but intake tends to be low in young children. Australian recommendations state that Fe-rich foods, including M/MA, should be the first complementary foods offered to infants. The present paper reports M/MA consumption of Australian infants and toddlers, compares intake with guidelines, and suggests strategies to enhance adherence to those guidelines. Mother–infant dyads recruited as part of the NOURISH and South Australian Infants Dietary Intake studies provided 3 d of intake data at three time points: Time 1 (T1) (n 482, mean age 5·5 (SD 1·1) months), Time 2 (T2) (n 600, mean age 14·0 (SD 1·2) months) and Time 3 (T3) (n 533, mean age 24 (SD 0·7) months). Of 170 infants consuming solids and aged greater than 6 months at T1, 50 (29 %) consumed beef, lamb, veal (BLV) or pork on at least one of 3 d. Commercial infant foods containing BLV or poultry were the most common form of M/MA consumed at T1, whilst by T2 BLV mixed dishes (including pasta bolognaise) became more popular and remained so at T3. Processed M/MA increased in popularity over time, led by pork (including ham). The present study shows that M/MA are not being eaten by Australian infants or toddlers regularly enough, or in adequate quantities to meet recommendations, and that the form in which these foods are eaten can lead to smaller M/MA serve sizes and greater Na intake. Parents should be encouraged to offer M/MA in a recognisable form, as one of the first complementary foods, in order to increase acceptance at a later age.
Abstract:
Typing 2 or 3 keywords into a browser has become an easy and efficient way to find information. Yet, typing even short queries becomes tedious on ever shrinking (virtual) keyboards. Meanwhile, speech processing is maturing rapidly, facilitating everyday language input. Also, wearable technology can inform users proactively by listening in on their conversations or processing their social media interactions. Given these developments, everyday language may soon become the new input of choice. We present an information retrieval (IR) algorithm specifically designed to accept everyday language. It integrates two paradigms of information retrieval, previously studied in isolation; one directed mainly at the surface structure of language, the other primarily at the underlying meaning. The integration was achieved by a Markov machine that encodes meaning by its transition graph, and surface structure by the language it generates. A rigorous evaluation of the approach showed, first, that it can compete with the quality of existing language models, second, that it is more effective the more verbose the input, and third, as a consequence, that it is promising for an imminent transition from keyword input, where the onus is on the user to formulate concise queries, to a modality where users can express their need for information in everyday language more freely, more informally, and more naturally.
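The surface-structure side of such a retrieval model can be sketched as query-likelihood scoring with per-document bigram Markov chains (an illustrative standard technique; the paper's encoding of meaning via the transition graph is not reproduced here):

```python
import math
from collections import defaultdict

def train_bigram_models(docs):
    """One bigram (first-order Markov) model per document, with
    add-one smoothing over the shared vocabulary. Documents are
    plain whitespace-tokenised strings."""
    vocab_size = len({w for d in docs for w in d.split()})
    models = []
    for d in docs:
        words = d.split()
        counts = defaultdict(lambda: defaultdict(int))
        totals = defaultdict(int)
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
            totals[a] += 1
        models.append((counts, totals, vocab_size))
    return models

def query_log_likelihood(query, model):
    """Score a document by the log-probability that its Markov chain
    generates the query's bigrams (query-likelihood retrieval)."""
    counts, totals, vocab_size = model
    words = query.split()
    logp = 0.0
    for a, b in zip(words, words[1:]):
        logp += math.log((counts[a][b] + 1) / (totals[a] + vocab_size))
    return logp
```

Ranking documents by this score rewards documents whose generated language resembles the verbose, everyday-language query, which is why longer input helps.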
Abstract:
This paper presents visual detection and classification of light vehicles and personnel on a mine site. We capitalise on the rapid advances of ConvNet based object recognition but highlight that a naive black box approach results in a significant number of false positives. In particular, the lack of domain specific training data and the unique landscape in a mine site causes a high rate of errors. We exploit the abundance of background-only images to train a k-means classifier to complement the ConvNet. Furthermore, localisation of objects of interest and a reduction in computation is enabled through region proposals. Our system is tested on over 10 km of real mine site data and we were able to detect both light vehicles and personnel. We show that the introduction of our background model can reduce the false positive rate by an order of magnitude.
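The background-model idea, clustering features from background-only images and flagging detections that land near a background cluster, can be sketched as follows (a generic k-means sketch on made-up 2-D features; the real system would cluster ConvNet features):

```python
import numpy as np

def fit_background_centroids(bg_features, k=8, iters=20):
    """Plain k-means (Lloyd's algorithm) on feature vectors from
    background-only images. Initialising from the first k points
    keeps this sketch deterministic; it is a simplified stand-in
    for the paper's background model."""
    centroids = bg_features[:k].astype(float).copy()
    for _ in range(iters):
        d = np.linalg.norm(bg_features[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            members = bg_features[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids

def is_background(feature, centroids, threshold):
    """Treat a detection as a likely false positive when its feature
    vector is within `threshold` of the nearest background centroid."""
    return float(np.linalg.norm(centroids - feature, axis=1).min()) < threshold
```

Detections from the ConvNet whose features resemble the abundant background imagery are suppressed, which is how the false positive rate drops without retraining the detector.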
Abstract:
This proposal describes the innovative and competitive lunar payload solution developed at the Queensland University of Technology (QUT): the LunaRoo, a hopping robot designed to exploit the Moon's lower gravity to leap up to 20 m above the surface. It is compact enough to fit within a 10 cm cube, whilst providing unique observation and mission capabilities by creating imagery during the hop. This first section is deliberately kept short and concise for web submission; additional information can be found in the second chapter.
Abstract:
Purpose This study aims to gain a clearer understanding of digital channel design. The emergence of new technologies has revolutionised the way companies interact and engage with customers. The driver for this research was the suggestion that practitioners feel they do not possess the skills to understand and exploit new digital channel opportunities. To gain a clearer understanding of digital channel design, this paper addresses the research question: What digital channels do companies from a wide range of industries and sectors use? Design/methodology/approach A content analysis of 100 international companies was conducted with multiple data sources to form a typology of digital "touchpoints". A typology was appropriate for this study because it develops rigorous and useful concepts that clarify and refine the meaning of digital channels. Findings This study identifies what digital channels companies globally currently employ and explores the related needs across industries. A total of 34 digital touchpoints and 4 typologies of digital channels were identified across 16 industries. This research helps to identify the relationships between digital channels and the industries that use them. Research limitations/implications The findings contribute to the growing research area of digital channels. The typology of digital channels is a useful starting point for developing a systematic, theory-based study for enabling the development of broader, comprehensive theories of digital channels. Practical implications Typologies and touchpoints are outlined in relation to industry, company objectives and customer needs to allow businesses to seize opportunities and optimise performance through individual touchpoints. A digital channel model, as a key outcome of this research, guides practitioners on what touchpoint to implement through an interrelated understanding of industry, company and customer needs.
Originality/value This is the first paper to explore a range of industries in relation to their use of digital channels using a unique content analysis. Contributions include clarifying and refining digital channel meaning; identifying and refining the hierarchical relations among digital channels (typologies); and establishing a typology–industry relationship model.
Abstract:
The unsustainable and exploitative use of one of the most important but scarce resources on the planet - freshwater - continues to create conflict and human dislocation on a grand scale. Instead of witnessing nation-states adopting more equitable and efficient conservation strategies, powerful corporations are permitted to privatise and monopolise diminishing water reservoirs based on flawed neo-liberal assumptions and market models of the 'global good'. The commodification of water has enabled corporate monopolies and corrupt states to exploit a fundamental human right, and in the process has created new forms of criminality. In recent years, affluent industrialised nations have experienced violent rioting as protestors express opposition to government 'freshwater taxes' and to corporate investors seeking to privatise drinking water. These water conflicts have included unprecedented clashes with police and deaths of innocent civilians in South Africa (BBC News, 2014a); the United Nations intervention in Detroit, USA after weeks of public protest (Burns, 2014); and the hundreds of thousands of people protesting in Ireland (BBC News, 2014b; Irish Times, 2015). Subsequently, the commodification of freshwater has become a criminological issue for water-abundant rich states, as well as for the highly indebted water-scarce nations.
Abstract:
This article considers the recent international controversy over the patents held by a Melbourne firm, Genetic Technologies Limited (GTG), in respect of non-coding DNA and genomic mapping. It explores the ramifications of the GTG dispute in terms of licensing, litigation, and policy reform, and—as a result of this dispute—the perceived conflict between law and science. GTG has embarked upon an ambitious licensing program with twenty-seven commercial licensees and five research licensees. Most significantly, GTG has obtained an exclusive licence from Myriad Genetics to use and exploit its medical diagnostics in Australia, New Zealand, and the Asia-Pacific region. In the US, GTG brought a legal action for patent infringement against the Applera Corporation and its subsidiaries. In response, Applera counterclaimed that the patents of GTG were invalid because they failed to comply with the requirements of US patent law, such as novelty, inventive step, and written specifications. In New Zealand, the Auckland District Health Board brought legal action in the High Court, seeking a declaration that the patents of GTG were invalid, and that, in any case, the Board had not infringed them. The New Zealand Ministry of Health and the Ministry of Economic Development have reported to Cabinet on the issues relating to the patenting of genetic material. Similarly, the Australian Law Reform Commission (ALRC) has also engaged in an inquiry into gene patents and human health; and the Advisory Council on Intellectual Property (ACIP) has considered whether there should be a new defence in respect of experimental use and research.
Abstract:
In recent years, both developing and industrialised societies have experienced riots and civil unrest over the corporate exploitation of fresh water. Water conflicts increase as water scarcity rises and the unsustainable use of fresh water will continue to have profound implications for sustainable development and the realisation of human rights. Rather than states adopting more costly water conservation strategies or implementing efficient water technologies, corporations are exploiting natural resources in what has been described as the “privatization of water”. By using legal doctrines, states and corporations construct fresh water sources as something that can be owned or leased. For some regions, the privatization of water has enabled corporations and corrupt states to exploit a fundamental human right. Arguing that such matters are of relevance to criminology, which should be concerned with fundamental environmental and human rights, this article adopts a green criminological perspective and draws upon Treadmill of Production theory.