577 results for chosen-plaintext attack


Relevance: 10.00%

Abstract:

Survey-based health research is in a boom phase following increased health spending in OECD countries and growing interest in ageing. A general characteristic of survey-based health research is its diversity. Different studies are based on different health questions in different datasets; they use different statistical techniques; they differ in whether they approach health from an ordinal or cardinal perspective; and they differ in whether they measure short-term or long-term effects. The question in this paper is simple: do these differences matter for the findings? We investigate the effects of lifestyle choices (drinking, smoking, exercise) and income on six measures of health in the US Health and Retirement Study (HRS) between 1992 and 2002: (1) self-assessed general health status, (2) problems with undertaking daily tasks and chores, (3) mental health indicators, (4) BMI, (5) the presence of serious long-term health conditions, and (6) mortality. We compare ordinal models with cardinal models; we compare models with fixed effects to models without fixed effects; and we compare short-term effects to long-term effects. We find considerable variation in the impact of different determinants on our chosen health outcome measures; we find that it matters whether ordinality or cardinality is assumed; we find substantial differences between estimates that account for fixed effects and those that do not; and we find that short-run and long-run effects differ greatly. All this implies that health is an even more complicated notion than hitherto thought, defying generalization from one measure to another or from one methodology to another.
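To make concrete why fixed-effects and non-fixed-effects estimates can diverge so sharply, here is a minimal sketch on synthetic data (the variable names and data-generating process are illustrative assumptions, not the HRS setup): when an unobserved person-level trait drives both income and health, the pooled estimate absorbs it, while the within (fixed-effects) estimate removes it.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_waves = 500, 6
person = np.repeat(np.arange(n_people), n_waves)

alpha = rng.normal(0.0, 1.0, n_people)                 # unobserved person trait
income = alpha[person] + rng.normal(0.0, 1.0, person.size)
health = 0.5 * income + alpha[person] + rng.normal(0.0, 1.0, person.size)

def ols_slope(x, y):
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / (x @ x)

# Pooled (no fixed effects): biased upward, since income proxies the trait.
print("pooled slope:", round(ols_slope(income, health), 3))

def demean_by_person(v):
    means = np.bincount(person, weights=v) / np.bincount(person)
    return v - means[person]

# Within (fixed-effects) estimator: person means removed, trait cancels out.
print("fixed-effects slope:",
      round(ols_slope(demean_by_person(income), demean_by_person(health)), 3))
```

With the true effect set to 0.5, the pooled slope lands near 1.0 while the within estimate recovers roughly 0.5, mirroring the paper's point that the modelling choice materially changes the findings.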

Relevance: 10.00%

Abstract:

X-ray computed tomography (CT) is a medical imaging technique that produces images of trans-axial planes through the human body. Compared with a conventional radiograph, which is an image of many planes superimposed on each other, a CT image exhibits significantly improved contrast, although this comes at the expense of reduced spatial resolution.

A CT image is reconstructed mathematically from a large number of one-dimensional projections of the chosen plane. These projections are acquired electronically using a linear array of solid-state detectors and an x-ray source that rotates around the patient.

X-ray computed tomography is used routinely in radiological examinations. It has also been found useful in special applications such as radiotherapy treatment planning and three-dimensional imaging for surgical planning.
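As a rough illustration of the reconstruction principle described above (assuming a recent scikit-image installation; the phantom and angle count are arbitrary choices), the sketch below simulates one-dimensional projections of a plane and reconstructs the slice by filtered back-projection:

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

slice_2d = rescale(shepp_logan_phantom(), 0.5)   # a synthetic trans-axial plane
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(slice_2d, theta=angles)         # 1-D projections per source angle
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")

rms = np.sqrt(np.mean((reconstruction - slice_2d) ** 2))
print(f"RMS reconstruction error: {rms:.4f}")
```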

Relevance: 10.00%

Abstract:

Modelling of interferometric signals related to tear film surface quality is considered. In the context of tear film surface quality estimation in normal healthy eyes, two clinical parameters are of interest: the build-up time and the average interblink surface quality. The former is closely related to the signal's derivative, the latter to the signal itself. Polynomial signal models, chosen for a particular set of noisy interferometric measurements, can be optimally selected, in some sense, with a range of information criteria such as AIC, MDL, Cp, and CME. Those criteria, however, do not always guarantee that the true derivative of the signal is accurately represented, and they often overestimate it. Here, a practical method for judicious selection of model order in polynomial fitting is proposed so that the derivative of the signal is adequately represented. The paper highlights the importance of context-based signal modelling in model order selection.
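The tension the paper addresses can be reproduced in a few lines. The sketch below is an illustration with a made-up signal and noise level, not the authors' method or data: it selects a polynomial order by a Gaussian-likelihood AIC and then checks how well the fitted derivative matches the known truth.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
true = np.sin(2 * np.pi * t)                 # stand-in for the underlying signal
y = true + rng.normal(0.0, 0.1, t.size)      # noisy measurements

def aic(y, yhat, n_params):
    rss = np.sum((y - yhat) ** 2)
    return y.size * np.log(rss / y.size) + 2 * n_params

fits = {k: Polynomial.fit(t, y, k) for k in range(1, 13)}
scores = {k: aic(y, p(t), k + 1) for k, p in fits.items()}
best = min(scores, key=scores.get)
print("AIC-selected order:", best)

# The criterion scores the fit to the signal, not to its derivative,
# so assess the derivative separately against the known truth.
d_true = 2 * np.pi * np.cos(2 * np.pi * t)
for k in sorted({3, best, 12}):
    d_err = np.sqrt(np.mean((fits[k].deriv()(t) - d_true) ** 2))
    print(f"order {k:2d}: derivative RMS error = {d_err:.3f}")
```

Typically the higher orders track the noisy samples slightly better while their derivatives oscillate badly, which is exactly why the paper argues for context-based, derivative-aware order selection.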

Relevance: 10.00%

Abstract:

Problem-based learning (PBL) is a pedagogical methodology that presents the learner with a problem to be solved in order to stimulate and situate learning. This paper presents key characteristics of a problem-based learning environment that determine its suitability as a data source for work-related research studies. To date, little has been written about the availability and validity of PBL environments as a data source and their suitability for work-related research. We describe problem-based learning and use a research project case study to illustrate the challenges associated with industry work samples. We then describe the PBL course used in our research case study and use this example to illustrate the key attributes of problem-based learning environments, showing how the chosen PBL environment met the work-related research requirements of the case study. We propose that the more realistic the PBL work context and work group composition, the better the PBL environment serves as a data source for work-related research. The work context is more realistic when relevant and complex project-based problems are tackled in industry-like work conditions over longer time frames. Work group composition is more realistic when participants with industry-level education and experience enact specialized roles in different disciplines within a professional community.

Relevance: 10.00%

Abstract:

We all know that the future of news is digital, but mainstream news providers are still grappling with how to entice more customers to their online sites. This paper provides context for a survey currently underway on user intentions towards online news and entertainment by exploring: 1. consumer behaviours and intentions with regard to accessing online news and information; 2. current trends in the Australian online news and information sector; and 3. key issues and emerging opportunities in the Australian (and global) environment. Key influences on the use of online news and information are pricing and access. The paper highlights emerging technical opportunities and flags service gaps. These gaps include multiple disconnects between: 1. changing user intentions towards online and location-based news (news based on a specific locality chosen by the user) and information; 2. consumers' ability to act on these intentions, given the availability and cost of technologies; 3. younger users' preference for entertainment, or 'infotainment', over news; and 4. the current online offerings of traditional news providers and the opportunities available to them. These disconnects present an opportunity for online news suppliers to appraise and resolve; doing so may enhance their online news and information offerings, attract consumers and improve loyalty. Outcomes from this paper will be used to identify knowledge gaps and contribute to the development of further analysis of Australian consumers and their behaviours and intentions towards online news and information. This will be undertaken via focus groups as part of a broader study.

Relevance: 10.00%

Abstract:

This study reported on the issues surrounding the acquisition of problem-solving competence of middle-year students who had been ascertained as above average in intelligence, but underachieving in problem-solving competence. In particular, it looked at the possible links between problem-posing skills development and improvements in problem-solving competence. A cohort of Year 7 students at a private, non-denominational, co-educational school was chosen to participate in the study, undertaking a series of problem-posing sessions each week throughout a school term. The lessons were facilitated by the researcher in the students' school setting. Two criteria were used to identify participants for this study: firstly, each participant scored above the 60th percentile in the standardized Middle Years Ability Test (MYAT) (Australian Council for Educational Research, 2005); and secondly, the participants all scored below the cohort average for Criterion B (the problem-solving criterion) in their school mathematics tests during the first semester of Year 7. Two mutually exclusive groups of participants were investigated, one constituting the Comparison Group and the other the Intervention Group. The Comparison Group was chosen from a Year 7 cohort for whom no problem-posing intervention had occurred, while the Intervention Group was chosen from the Year 7 cohort of the following year. This second group received the problem-posing intervention in the form of a teaching experiment. That is, the Comparison Group was only pre-tested and post-tested, while the Intervention Group was involved in the teaching experiment and received the pre-testing and post-testing at the same time of year, but in the following year, when the Comparison Group had moved on to the secondary part of the school. The groups were chosen from consecutive Year 7 cohorts to avoid cross-contamination of the data. A constructionist framework was adopted for this study, allowing the researcher to gain an "authentic understanding" of the changes that occurred in the development of problem-solving competence of the participants in the context of a classroom setting (Richardson, 1999). Qualitative and quantitative data were collected through a combination of methods including researcher observation and journal writing, videotaping, student workbooks, informal student interviews, student surveys, and pre-testing and post-testing. This combination of methods was required to increase the validity of the study's findings through triangulation of the data. The study findings showed that participation in problem-posing activities can facilitate the re-engagement of disengaged, middle-year mathematics students. In addition, participation in these activities can result in improved problem-solving competence and associated developmental learning changes. Changes evident as a result of this study included improvements in self-regulation, increased integration of prior knowledge with new knowledge, and increased and contextualised socialisation.

Relevance: 10.00%

Abstract:

Monitoring Internet traffic is critical to acquiring a good understanding of threats to computer and network security and to designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or to optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature: analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or outbreaks of new automated malicious codes, such as worms. The outcomes of this research include:

• Identification of repeated use of attack tools and attack processes, through grouping of activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualize attackers' behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components' residual space and the squared prediction error statistic (see the sketch below);
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
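As a rough sketch of the residual-space detection named in the outcomes above (the features, component count and threshold are illustrative assumptions, not the thesis's choices): project baseline traffic onto a few principal components, then flag observations whose squared prediction error (SPE) in the residual space exceeds a baseline-derived limit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Baseline traffic features per connection (synthetic stand-ins for e.g.
# packet counts, mean inter-arrival time, bytes, distinct ports).
baseline = rng.normal(0.0, 1.0, (500, 3)) @ rng.normal(0.0, 1.0, (3, 6))
baseline += rng.normal(0.0, 0.1, baseline.shape)

mean = baseline.mean(axis=0)
_, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
P = Vt[:3].T                                  # retained principal components

def spe(rows):
    centred = rows - mean
    resid = centred - centred @ P @ P.T       # part left in the residual space
    return np.sum(resid ** 2, axis=1)

threshold = np.percentile(spe(baseline), 99)  # empirical control limit

new_obs = 3.0 * rng.normal(0.0, 1.0, (10, 6))  # structurally unlike baseline
print("flagged as new:", spe(new_obs) > threshold)
```

Observations resembling known activity project almost entirely onto the retained components and pass quietly; structurally new activity leaves a large residual and trips the SPE limit, which is the essence of using the residual space for new-attack detection.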

Relevance: 10.00%

Abstract:

The main objective of this PhD was to further develop Bayesian spatio-temporal models (specifically the Conditional Autoregressive (CAR) class of models) for the analysis of sparse disease outcomes such as birth defects. The motivation for the thesis arose from problems encountered when analyzing a large birth defect registry in New South Wales. The specific components and related research objectives of the thesis were developed from gaps in the literature on current formulations of the CAR model, and from health service planning requirements. Data from a large probabilistically-linked database from 1990 to 2004, consisting of fields from two separate registries, the Birth Defect Registry (BDR) and the Midwives Data Collection (MDC), were used in the analyses in this thesis.

The main objective was split into smaller goals. The first goal was to determine how the specification of the neighbourhood weight matrix affects the smoothing properties of the CAR model, and this is the focus of chapter 6. Secondly, I hoped to evaluate the usefulness of incorporating a zero-inflated Poisson (ZIP) component as well as a shared-component model for modeling a sparse outcome, and this is carried out in chapter 7. The third goal was to identify optimal sampling and sample size schemes designed to select individual-level data for a hybrid ecological spatial model, and this is done in chapter 8. Finally, I wanted to put together the earlier improvements to the CAR model and, along with demographic projections, provide forecasts for birth defects at the SLA level. Chapter 9 describes how this is done.

For the first objective, I examined a series of neighbourhood weight matrices and showed how smoothing the relative risk estimates according to similarity on an important covariate (i.e. maternal age) helped improve the model's ability to recover the underlying risk, compared to the traditional adjacency (specifically the Queen) method of applying weights. Next, to address the sparseness and excess zeros commonly encountered in the analysis of rare outcomes such as birth defects, I compared several models, including an extension of the usual Poisson model to encompass excess zeros in the data. This was achieved via a mixture model, which also encompassed the shared-component model to improve the estimation of sparse counts by borrowing strength across a shared component (e.g. latent risk factors) with a referent outcome (caesarean section was used in this example). Using the Deviance Information Criterion (DIC), I showed how the proposed model performed better than the usual models, but only when both outcomes shared a strong spatial correlation.

The next objective involved identifying the optimal sampling and sample size strategy for incorporating individual-level data with areal covariates in a hybrid study design. I performed extensive simulation studies, evaluating thirteen different sampling schemes along with variations in sample size. This was done in the context of an ecological regression model that incorporated spatial correlation in the outcomes, as well as accommodating both individual and areal measures of covariates. Using the Average Mean Squared Error (AMSE), I showed how a simple random sample of 20% of the SLAs, followed by selecting all cases in the chosen SLAs along with an equal number of controls, provided the lowest AMSE.

The final objective involved combining the improved spatio-temporal CAR model with population (i.e. women) forecasts to provide 30-year annual estimates of birth defects at the Statistical Local Area (SLA) level in New South Wales, Australia. The projections were illustrated using sixteen different SLAs, representing the various areal measures of socio-economic status and remoteness. A sensitivity analysis of the assumptions used in the projections was also undertaken.

By the end of the thesis, I will show how challenges in the spatial analysis of rare diseases such as birth defects can be addressed by specifically formulating the neighbourhood weight matrix to smooth according to a key covariate (i.e. maternal age), incorporating a ZIP component to model excess zeros in outcomes, and borrowing strength from a referent outcome (i.e. caesarean counts). An efficient strategy for sampling individual-level data, and sample size considerations for rare diseases, will also be presented. Finally, projections of birth defect categories at the SLA level will be made.
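A toy sketch of the covariate-similarity weighting idea behind the first objective (the kernel form, bandwidth and values are assumptions for illustration, not the thesis's specification): start from Queen-style adjacency, then scale each neighbour weight by similarity in an area-level covariate such as mean maternal age, so smoothing borrows more from demographically similar areas.

```python
import numpy as np

adjacency = np.array([                # 4 areas, symmetric Queen contiguity
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

maternal_age = np.array([27.0, 31.5, 30.8, 24.9])   # illustrative area means

# Kernel on covariate distance; the bandwidth is a tuning assumption.
bandwidth = 2.0
similarity = np.exp(-np.abs(maternal_age[:, None] - maternal_age[None, :])
                    / bandwidth)

W = adjacency * similarity            # covariate-weighted neighbourhood matrix
W = W / W.sum(axis=1, keepdims=True)  # row-standardise, as is usual for CAR
print(np.round(W, 3))
```

Under plain adjacency every neighbour would get equal weight; here areas 2 and 3, with similar maternal-age profiles, weight each other more heavily than they weight areas 1 and 4.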

Relevance: 10.00%

Abstract:

Typical high strength steels (HSS) combine exceptionally high strength with improved weldability, making the material attractive in modern steel construction. However, due to a lack of understanding of their behaviour, most current steel design standards are limited to conventional low strength steels (LSS, i.e. fy ≤ 450 MPa). This paper presents the details of full-scale experimental tests on short beams fabricated from BISPLATE80 HSS material (nominal fy = 690 MPa). The slenderness ratios of the plate elements in the test specimens were chosen in a range near the current yield limits (AS4100-1998, etc.). The experimental studies presented in this paper have produced a better understanding of the structural behaviour of HSS members subject to local instabilities. Comparisons are also presented against the design predictions of the current steel standards (AS4100-1998). The study provides a series of proposals for the proper assessment of plate slenderness limits for structural members made of representative HSS materials, and supports the inclusion of typical HSS materials in future versions of steel design specifications for buildings and bridges. The paper also presents a distribution model of the longitudinal residual stresses in typical HSS I-sections.
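For context on why a higher yield stress pushes the same plate geometry toward the slenderness limits: AS4100 normalises element slenderness by yield stress, in the commonly quoted form sketched below (the governing clauses and limit values should be checked against the standard itself).

```latex
\lambda_e = \frac{b}{t}\sqrt{\frac{f_y}{250}}
```

For the same width-to-thickness ratio $b/t$, moving from $f_y = 450$ MPa to $690$ MPa increases $\lambda_e$ by a factor of $\sqrt{690/450} \approx 1.24$, so elements that satisfied LSS-calibrated yield limits may no longer do so; hence the paper's focus on specimens near those limits.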

Relevance: 10.00%

Abstract:

The literature identifies several models that describe inter-phase mass transfer, the key to the emission process. While the emission process is complex, and these models may be more or less successful at predicting mass transfer rates, they identify three key variables for a system involving a liquid phase and an air phase in contact with it:

• a concentration (or partial pressure) gradient driving force;
• the fluid dynamic characteristics within the liquid and air phases; and
• the chemical properties of the individual components within the system.

In three applied research projects conducted prior to this study, samples collected with two well-known sampling devices resulted in very different odour emission rates. It was not possible to adequately explain the differences observed. It appeared likely, however, that the sample collection device had artefact effects on the emission of odorants, i.e. the sampling device appeared to have altered the mass transfer process. This raised an obvious question: where two different emission rates are reported for a single source (differing only in the selection of sampling device), and a credible explanation for the difference cannot be provided, which emission rate is correct? This research project aimed to identify the factors that determine odour emission rates, the impact that the characteristics of a sampling device may exert on the key mass transfer variables, and, ultimately, the impact of the sampling device on the emission rate itself. To meet these objectives, a series of targeted reviews and laboratory and field investigations was conducted. Two widely-used, representative devices were chosen to investigate the influence of various parameters on the emission process. These investigations provided insight into the odour emission process generally, and the influence of the sampling device specifically.
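One common form of the inter-phase models referred to above is the classical two-film (Lewis-Whitman) description, sketched here for orientation (notation is assumed; the thesis may use a different formulation):

```latex
J = K_{OL}\left(C_L - \frac{C_G}{H}\right),
\qquad
\frac{1}{K_{OL}} = \frac{1}{k_L} + \frac{1}{H\,k_G}
```

Here $J$ is the emission flux, $C_L$ and $C_G$ the bulk liquid- and gas-phase concentrations, $H$ the dimensionless Henry's law constant, and $k_L$, $k_G$ the liquid- and gas-film coefficients. A sampling device that changes the sweep-air flow over the surface alters $k_G$ (and possibly the bulk $C_G$), which is one mechanism by which the device itself can change the measured emission rate.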

Relevance: 10.00%

Abstract:

The author's approach to the problems associated with building in bushfire-prone landscapes comes from 12 years of study of the biophysical and cultural landscapes of the Great Southern Region of Western Australia - research which resulted in the design and construction of the H-house at Bremer Bay. The house was developed using a 'ground up' approach whereby Dr Weir conducted topographical surveys and worked with a local botanist and a bushfire risk consultant to ascertain the level of threat that fire presented to this particular site. The intention from the outset, however, was not to design a bushfire-resistant house per se, but to develop a design that would place the owners in close proximity to the highly biodiverse heath vegetation of their site. The research aim was to find ways - through architectural design - to link the patterns of usage of the house with other site-specific conditions related to the prevailing winds, solar orientation and seasonal change. The H-house has a number of features which increase the level of bushfire safety. These include:

• fire-rated roller shutters (tested by the CSIRO for ember attack and radiant heat);
• fire-resistant double glazing (on windows not protected by the shutters);
• fibre-cement sheet cladding on the underside of the elevated timber floor structure;
• a manually operated high-pressure sprinkler system on exposed timber decks;
• a fire refuge (an enlarged laundry and shower area) within the house, with a dedicated cabinet for fire-fighting equipment; and
• a low-pressure, solar-powered domestic water supply system.

Relevance: 10.00%

Abstract:

The field was the curation of cross-cultural new media/digital media practices within large-scale exhibition practice in China. The context was improved understanding of the intertwining of the natural and the artificial with respect to landscape and culture, and their consequent effect on our contemporary globalised society. The research highlighted new languages of media art with respect to landscape, and the particular dialects underpinning them. The methodology was principally practice-led.

The research brought together over 60 practitioners from both local and diasporic Asian, European and Australian cultures for the first time within a Chinese exhibition context. By pursuing a strong response to both cultural displacement and re-identification, the research forged and documented an enduring commonality within difference - an agenda further concentrated through sensitivities surrounding that year's Beijing Olympics. In contrast to the severe threats posed to the local dialects of many of the world's spoken and written languages, the 'Vernacular Terrain' project evidenced that many local creative 'dialects' of the environment-media art continuum had indeed survived and flourished.

The project was co-funded by the Beijing Film Academy, QUT Precincts, IDAProjects and Platform China Art Institute. A broad range of peer-reviewed grants was won, including from the Australia China Council and the Australian Embassy in China. Through invitations from external curators, much of the work then traveled to other venues, including the Block Gallery at QUT and the outdoor screens at Federation Square, Melbourne. The Vernacular Terrain catalogue featured a comprehensive history of the IDA project from 2000 to 2008 alongside several major essays. Owing to the reputation IDA Projects had established, the team was invited to curate a major exhibition showcasing fifty new media artists: The Vernacular Terrain, at the prestigious Songzhuang Art Museum, Beijing, in December 2007 - January 2008. The exhibition was designed for an extensive, newly opened gallery owned by one of China's most important art historians, Li Xianting. This exhibition was not only the gallery's inaugural non-Chinese-curated show but also its first new media exhibition. It included important works by artists such as Peter Greenaway, Michael Roulier, Maleonn and Cui Xiuwen.

Each artist was chosen both for their focus upon their own local environmental concerns and for their specific forms of practice - including virtual world design, interactive design, video art, real-time and manipulated multiplayer gaming platforms, and web 2.0 practices. The exhibition examined the interconnectivities of cultural dialogue on both a micro and a macro scale, incorporating the local and the global through display methods and design approaches that stitched these diverse practices into a spatial map of meanings and conversations. By examining the context of each artist's practice in relationship to the specificity of their own local place and prevailing global contexts, the exhibition sought to uncover a global vernacular. Through this concentrated anthropological direction, the research identified key themes and concerns of a contextual language clearly underpinned by distinctive local 'dialects', thereby contributing to a profound sense of cross-cultural association. Through augmentation of existing discourse, the exhibition confirmed the enduring relevance and influence of both localized and globalised languages of the landscape-technology continuum.

Relevance: 10.00%

Abstract:

The field was the curation of new media within large-scale exhibition practice for Chinese audiences. The context was improved understanding of the intertwining cultures and concerns of Chinese and Western contemporary practitioners. The research uncovered a range of connective and dialogical concerns around cultural displacement and re-identification, germane to the chosen group of media artists. The methodology was principally practice-led. The research brought together 31 practitioners from Asian, European and Australasian cultures within a major, highly visible Chinese exhibition context. By identifying and promoting a distinct commonality within difference amongst the diverse practitioners, the exhibition successfully activated a global dialogue that incorporated environmental and cultural identity agendas within a major Chinese educational and public context - thereby promulgating cross-cultural understanding, despite the often oppressive shadowing of domestic political processes. The project was developed under the international aegis of IDA Projects (established in 1999) and was substantially supported by the Fine Art Department of the Beijing Film Academy, QUT Precincts and Platform China Art Institute. It built upon IDA's 2005 inaugural new media exhibition at the Today Art Museum in Beijing - now recognised as one of the leading art spaces in China. Numerous peer-reviewed grants were won, including from the Australian Embassy in China and the Australia China Council. Through subsequent invitations from external curators, the work then traveled in a range of reconfigured formats to other major venues, including the Block Gallery at QUT, Brisbane, and ZAIM Artspace, Yokohama, Japan. A major catalogue with authoritative essays was also printed.

Relevance: 10.00%

Abstract:

This paper argues, somewhat along a Simmelian line, that political theory may produce practical and universal theories like those developed in theoretical physics. The aim of this paper is to show that the Element of Democracy Theory may be true by comparing it to Einstein's Special Relativity, specifically against the parameters of symmetry, unification, simplicity, and utility. These parameters are what make a theory in physics: meeting them not only fits with current knowledge but also produces paths towards testing (application). As the Element of Democracy Theory meets these same parameters, it could settle the debate concerning the definition of democracy. This will be shown firstly by discussing why no one has yet achieved a universal definition of democracy; secondly by explaining the parameters chosen (that is, why these and not others confirm or scuttle theories); and thirdly by comparing how Special Relativity and the Element of Democracy match the parameters.

Relevance: 10.00%

Abstract:

Hot and cold temperatures significantly increase mortality rates around the world, but which measure of temperature is the best predictor of mortality is not known. We used mortality data from 107 US cities for the years 1987–2000 and examined the association between temperature and mortality using Poisson regression, modelling a non-linear temperature effect and a non-linear lag structure. We examined mean, minimum and maximum temperature, with and without humidity, as well as apparent temperature and the Humidex. The best measure was defined as that with the minimum cross-validated residual. We found large differences in the best temperature measure between age groups, seasons and cities, and no single temperature measure was superior to the others. The strong correlation between different measures of temperature means that, on average, they have the same predictive ability. The best temperature measure for new studies can therefore be chosen on practical grounds, such as choosing the measure with the least missing data.
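A minimal sketch of the kind of model described above (synthetic data; the study's actual models also include the lag structure and city-level effects, which are omitted here): a Poisson regression with a non-linear spline effect of temperature, fitted with statsmodels and pandas, both assumed available.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({"temp": rng.uniform(-10.0, 35.0, 2000)})
# U-shaped risk: daily deaths rise at both cold and hot extremes.
df["deaths"] = rng.poisson(np.exp(2.0 + 0.002 * (df["temp"] - 15.0) ** 2))

# Non-linear temperature effect via a B-spline basis in the formula.
model = smf.glm("deaths ~ bs(temp, df=4)", data=df,
                family=sm.families.Poisson()).fit()
print("deviance:", round(model.deviance, 1))
```

Comparing candidate measures (mean, minimum, maximum, apparent temperature, Humidex) would then amount to refitting the model with each measure in turn and scoring the cross-validated residuals, as the paper does.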