30 results for Chain of custody of traces
Abstract:
MADAM, Androgenetic alopecia (AGA) is a common age-dependent trait, characterized by a progressive loss of hair from the scalp. The hair loss may commence during puberty, and up to 80% of white men experience some degree of AGA during their lifetime.1 Research has established that two essential aetiological factors for AGA are a genetic predisposition and the presence of androgens (male sex hormones).1,2 A recent meta-analysis of genome-wide association studies (GWAS) has increased the number of identified loci associated with this trait at the molecular level to a total of eight.3 However, despite these successes, a large fraction of the genetic contribution remains to be identified. One way to identify further genetic loci is to combine the resource of GWAS datasets with knowledge about specific biological factors likely to be involved in the development of disease. The focused evaluation of a limited number of candidate genes in GWAS datasets avoids the need for extensive correction for multiple testing, which typically limits the power for detecting genetic loci at a genome-wide level.4 Because the presence of genetic association suggests that candidate genes are likely to operate early in the causative chain of events leading to the phenotype, this approach may also serve to prioritise biological pathways according to their importance in the development of AGA.
Abstract:
In this paper we consider HCI's role in technology interventions for health and well-being. Three projects carried out by the authors are analysed by appropriating the idea of a value chain to chart a causal history from proximal effects generated in early episodes of design through to distal health and well-being outcomes. Responding to recent arguments that favour bounding HCI's contribution to local patterns of use, we propose an unbounded view of HCI that addresses an extended value chain of influence. We discuss a view of HCI methods as mobilising this value chain perspective in multi-disciplinary collaborations through its emphasis on early prototyping and naturalistic studies of use.
Abstract:
In today's dynamic and turbulent environment, companies are required to increase their effectiveness and efficiency, exploit synergy and learn product innovation processes in order to build competitive advantage. To be able to stimulate and facilitate learning in product innovation, it is necessary to gain insight into the factors that hinder learning and to design effective intervention strategies that may help remove barriers to learning. This article reports on learning barriers identified by product innovation managers in over 70 companies in the UK, Ireland, Italy, the Netherlands, Sweden and Australia. The results show that the majority of the barriers identified can be labelled as organisational defensive routines leading to a chain of behaviours: lack of resources leads to under-appreciation of the value of valid information, absence of informed choice and lack of personal responsibility. An intervention theory is required which enables individuals and organisations to interrupt defensive patterns in ways that prevent them from recurring.
Abstract:
Railways in Hong Kong have been one of the few success stories among the major metropolitan cities around the world, not only for their profit-making operation but also for their efficiency in dealing with astonishingly high traffic demands every day. While railway operations require a chain of delicate systems working in harmony at all times, numerous engineering problems arise and jeopardise the quality of service. Research into various railway engineering problems is therefore essential to tackle them. This paper highlights the railway research works in Hong Kong and discusses their relevance to Mainland China.
Abstract:
In recent years, there has been dramatic growth in the number and popularity of online social networks. Many networks have more than 100 million registered users, such as Facebook, MySpace, QZone and Windows Live Spaces. People may connect, discover and share by using these online social networks. The exponential growth of online communities in the area of social networks has drawn researchers' attention to the importance of managing trust in online environments. Users of online social networks may share their experiences and opinions within the networks about an item, which may be a product or service. The user faces the problem of evaluating trust in a service or service provider before making a choice. Recommendations may be received through a chain of friends in the network, so the problem for the user is to be able to evaluate various types of trust opinions and recommendations. Such opinions or recommendations greatly influence whether other users in the community choose to use the item. Collaborative filtering is the most popular method in recommender systems. The task in collaborative filtering is to predict the utility of items to a particular user based on a database of ratings from a sample or population of other users. Because different people have different tastes, they rate items differently according to their subjective taste. If two people rate a set of items similarly, they share similar tastes. In a recommender system, this information is used to recommend items that one participant likes to other persons in the same cluster. However, collaborative filtering performs poorly when there are insufficient previous common ratings between users, commonly known as the cold start problem. To overcome the cold start problem, and with the dramatic growth of online social networks, trust-based approaches to recommendation have emerged. This approach assumes a trust network among users and makes recommendations based on the ratings of users who are directly or indirectly trusted by the target user.
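The collaborative-filtering prediction described above can be sketched in a few lines. This is a minimal illustration with made-up ratings data; the function names and the mean-centred, similarity-weighted formula are one common formulation, not the specific method of any system discussed here.

```python
import math

def pearson(a, b):
    """Pearson correlation between two users over their co-rated items."""
    common = set(a) & set(b)
    if len(common) < 2:
        return 0.0
    ma = sum(a[i] for i in common) / len(common)
    mb = sum(b[i] for i in common) / len(common)
    num = sum((a[i] - ma) * (b[i] - mb) for i in common)
    den = math.sqrt(sum((a[i] - ma) ** 2 for i in common) *
                    sum((b[i] - mb) ** 2 for i in common))
    return num / den if den else 0.0

def predict_rating(ratings, user, item):
    """Predict `user`'s rating for `item` as the user's mean rating plus
    the similarity-weighted deviation of other users who rated the item."""
    mean_u = sum(ratings[user].values()) / len(ratings[user])
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        w = pearson(ratings[user], r)
        mean_o = sum(r.values()) / len(r)
        num += w * (r[item] - mean_o)
        den += abs(w)
    return mean_u + num / den if den else mean_u

# hypothetical rating database: user -> {item: rating}
ratings = {
    "alice": {"i1": 5, "i2": 3, "i3": 4},
    "bob":   {"i1": 4, "i2": 2, "i3": 3, "i4": 4},
    "carol": {"i1": 1, "i2": 5, "i4": 2},
}
print(round(predict_rating(ratings, "alice", "i4"), 2))  # 4.71
```

Note that even carol's negative correlation with alice contributes usefully: her below-average rating of i4, weighted negatively, pushes alice's prediction upward.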
Abstract:
Users of online social networks communicate with members they know personally. They also communicate with other members of the network who are friends of their friends, or members of their friends' networks. They share their experiences and opinions within the social network about an item, which may be a product or service. The user faces the problem of evaluating trust in a service or service provider before making a choice. Opinions, reputations and recommendations influence users' choice and usage of online resources. Recommendations may be received through a chain of friends of friends, so the problem for the user is to be able to evaluate various types of trust recommendations and reputations. Such opinions or recommendations greatly influence whether other users in the community choose to use the item. Users share information on the level of trust they explicitly assign to other users. This trust can be used when making decisions based on any recommendation. In the absence of a direct connection to the recommending user, propagated trust can be useful.
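Trust propagation along a chain of friends of friends can be illustrated with a small sketch. The multiplicative propagation rule and the example network below are assumptions for illustration, not the specific model used in the paper.

```python
def propagated_trust(trust, source, target):
    """Best propagated trust from source to target: multiply direct
    trust values (in [0, 1]) along each acquaintance chain and keep
    the strongest chain (depth-first search over simple paths)."""
    best = 0.0
    stack = [(source, 1.0, {source})]
    while stack:
        node, t, seen = stack.pop()
        if node == target:
            best = max(best, t)
            continue
        for nxt, w in trust.get(node, {}).items():
            if nxt not in seen:
                stack.append((nxt, t * w, seen | {nxt}))
    return best

# hypothetical trust network: A trusts B strongly, C moderately
trust = {
    "A": {"B": 0.9, "C": 0.5},
    "B": {"D": 0.8},
    "C": {"D": 0.9},
}
print(round(propagated_trust(trust, "A", "D"), 2))  # 0.72 via A -> B -> D
```

Multiplication makes trust decay with chain length, matching the intuition that a recommendation from a friend of a friend carries less weight than a direct one.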
Abstract:
This volume breaks new ground by approaching Socially Responsible Investment (SRI) as an explicitly ethical practice in financial markets. The work explains the philosophical and practical shortcomings of long term shareholder value and the origins and conceptual structure of SRI, and links its pursuit to both its deeper philosophical foundations and the broader, multi-dimensional global movement towards greater social responsibility in global markets. Interviews with fund managers in the Australian SRI sector generate recommendations for better integrating ethics into SRI practice via ethically informed engagement with invested companies, and an in-depth discussion of the central practical SRI issue of fiduciary responsibility strengthens the case in favour of SRI. The practical and ethical theoretical perspectives are then brought together to sketch out an achievable ideal for SRI worldwide, in which those who are involved in investment and business decisions become part of an ethical chain of decision makers linking the ultimate owners of capital with the business executives who frame, advocate and implement business strategies. In between there are investment advisors, fund managers, business analysts and boards. The problem lies in the fact that the ultimate owners are discouraged from considering their own values, or even their own long term interests, whilst the others often look only to short term interests. The solution lies in the latter recognising themselves as links in the ethical chain.
Abstract:
The construction of timelines of computer activity is a part of many digital investigations. These timelines of events are composed of traces of historical activity drawn from system logs and potentially from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work introduces a software tool (CAT Detect) for the detection of inconsistency within timelines of computer activity. We examine the impact of deliberate tampering through experiments conducted with our prototype software tool. Based on the results of these experiments, we discuss techniques which can be employed to deal with such temporal inconsistencies.
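A minimal sketch of one kind of inconsistency check described here (not the actual CAT Detect implementation): flag log entries whose recorded timestamps run backwards relative to their append order, a simple signal of clock tampering or edited records.

```python
from dataclasses import dataclass

@dataclass
class Event:
    seq: int        # append order in the log
    timestamp: int  # recorded clock time (epoch seconds)
    desc: str

def find_inconsistencies(events):
    """Return pairs of consecutive log entries whose recorded timestamps
    run backwards relative to the append order."""
    ordered = sorted(events, key=lambda e: e.seq)
    return [(a, b) for a, b in zip(ordered, ordered[1:])
            if b.timestamp < a.timestamp]

log = [
    Event(1, 1000, "user login"),
    Event(2, 1100, "file created"),
    Event(3, 900, "file modified"),  # earlier timestamp than the prior entry
]
print(len(find_inconsistencies(log)))  # 1 inconsistent pair
```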
Abstract:
Rapid urbanisation and the resulting continuous increase in traffic have been recognised as key factors contributing increased pollutant loads to urban stormwater and, in turn, to receiving waters. Urbanisation primarily increases anthropogenic activities and the percentage of impervious surfaces in urban areas. These processes are collectively responsible for urban stormwater pollution. In this regard, urban traffic and land use related activities have been recognised as the primary pollutant sources. This is primarily due to the generation of a range of key pollutants such as solids, heavy metals and PAHs. Appropriate treatment system design is the most viable approach to mitigating stormwater pollution. However, limited understanding of pollutant processes and transport pathways constrains effective treatment design. This highlights the necessity for a detailed understanding of traffic and other land use related pollutant processes and pathways in relation to urban stormwater pollution. This study has created new knowledge in relation to pollutant processes and transport pathways encompassing atmospheric pollutants, atmospheric deposition and build-up on ground surfaces of traffic-generated key pollutants. The research study was primarily based on in-depth experimental investigations. This thesis describes the extensive knowledge created relating to the processes of atmospheric pollutant build-up, atmospheric deposition and road surface build-up, and establishes their relationships as a chain of processes. The analysis of atmospheric deposition revealed that both traffic and land use related sources contribute total suspended particulate matter (TSP) to the atmosphere. Traffic sources are dominant during weekdays, whereas land use related sources become dominant during weekends due to the reduction in traffic sources.
The analysis further concluded that atmospheric TSP, polycyclic aromatic hydrocarbon (PAH) and heavy metal (HM) concentrations are highly influenced by total average daily heavy duty traffic, traffic congestion and the fraction of commercial and industrial land uses. A set of mathematical equations was developed to predict TSP, PAH and HM concentrations in the atmosphere based on the influential traffic and land use related parameters. Dry deposition samples were collected for different antecedent dry days, and wet deposition samples were collected immediately after rainfall events. Dry deposition was found to increase with the antecedent dry days and consisted of relatively coarser particles (greater than 1.4 µm) when compared to wet deposition. Wet deposition showed a strong affinity to rainfall depth, but was not related to the antecedent dry period. It was also found that smaller particles (less than 1.4 µm) travel much longer distances from the source and deposit mainly with the wet deposition. Pollutants in wet deposition are less sensitive to source characteristics than those in dry deposition. Atmospheric deposition of HMs is not directly influenced by land use but rather by proximity to high emission sources such as highways. Therefore, it is important to consider atmospheric deposition as a key pollutant source to urban stormwater in the vicinity of these types of sources. Build-up was analysed for five particle size fractions, namely <1 µm, 1-75 µm, 75-150 µm, 150-300 µm and >300 µm, for solids, PAHs and HMs. The outcomes of the study indicated that PAHs and HMs in the <75 µm size fraction are generated mainly by traffic related activities, whereas the >150 µm size fraction is generated by both traffic and land use related sources. Atmospheric deposition is an important source for HM build-up on roads, whereas the contribution of PAHs from atmospheric sources is limited.
A comprehensive approach was developed to predict traffic and other land use related pollutants in urban stormwater based on traffic and other land use characteristics. This approach primarily included the development of a set of mathematical equations to predict traffic-generated pollutants by linking traffic and land use characteristics to stormwater quality through mathematical modelling. The outcomes of this research will contribute to the design of appropriate treatment systems to safeguard urban receiving water quality under future traffic growth scenarios. The real-world application of the knowledge generated was demonstrated through mathematical modelling of solids in urban stormwater, accounting for the variability in traffic and land use characteristics.
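The kind of predictive relationship described above, linking a traffic parameter to a pollutant concentration, can be illustrated with a single-predictor least-squares fit. The data and variable names below are hypothetical, not taken from the study, which developed multi-parameter equations.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x: a single-predictor
    stand-in for the multi-parameter equations developed in the study."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# hypothetical data: daily heavy-duty traffic count vs TSP concentration
traffic = [100, 200, 300, 400]
tsp = [55, 70, 85, 100]
a, b = fit_line(traffic, tsp)
print(round(a, 2), round(b, 2))  # 40.0 0.15
```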
Abstract:
High-stakes literacy testing is now a ubiquitous educational phenomenon. However, it remains a relatively recent phenomenon in Australia. Hence it is possible to study the ways in which such tests are reorganising educators' work during this period of change. This paper draws upon Dorothy Smith's Institutional Ethnography and critical policy analysis to consider this problem, and reports on interview data from teachers and the principal in a small rural school in a poor area of South Australia. In this context, high-stakes testing and the associated diagnostic school review unleash a chain of actions within the school which ultimately results in educators doubting their professional judgments, increasing their investment in testing, narrowing their teaching of literacy and purchasing levelled reading schemes. The effects of high-stakes testing in disadvantaged schools are identified and discussed.
Abstract:
This study focuses on understanding why the range of experience with respect to HIV infection is so diverse, especially as regards the latency period. The challenge is to determine what assumptions can be made about the nature of the experience of antigenic invasion and diversity that can be modelled, tested and argued plausibly. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties contributing to the individual disease experience and is included in a network which mimics the chain of lymphatic nodes. Dealing with massively multi-agent systems requires major computational effort. However, parallelisation methods are a natural consequence and advantage of the multi-agent approach. These are implemented using the MPI library.
Abstract:
Characterization of the human epigenetic profile since the initial breakthrough of the Human Genome Project has strongly established the key role of histone modifications and DNA methylation. These dynamic elements interact to determine the normal level of expression or methylation status of the constituent genes in the genome. Recently, considerable evidence has been put forward to demonstrate that environmental stress implicitly alters epigenetic patterns, causing imbalances that can lead to cancer initiation. This chain of consequences has motivated attempts to computationally model the influence of histone modification and DNA methylation on gene expression and to investigate their intrinsic interdependency. In this paper, we explore the relation between DNA methylation and transcription and characterize in detail the histone modifications for specific DNA methylation levels using a stochastic approach.
Abstract:
Crashes at any particular transport network location consist of a chain of events arising from a multitude of potential causes and/or contributing factors whose nature is likely to reflect the geometric characteristics of the road, spatial effects of the surrounding environment, and human behavioural factors. It is postulated that these potential contributing factors do not arise from the same underlying risk process, and thus should be explicitly modelled and understood. The state of the practice in road safety network management applies a safety performance function (SPF) that represents a single risk process to explain crash variability across network sites. This study aims to elucidate the importance of differentiating among the various underlying risk processes contributing to the observed crash count at any particular network location. To demonstrate the principle of this theoretical and corresponding methodological approach, the study explores engineering factors (e.g. segment length, speed limit) and unobserved spatial factors (e.g. climatic factors, presence of schools) as two explicit sources of crash contributing factors. A Bayesian Latent Class (BLC) analysis is used to explore these two sources and to incorporate prior information about their contribution to crash occurrence. The methodology is applied to the state controlled roads in Queensland, Australia, and the results are compared with the traditional Negative Binomial (NB) model. A comparison of goodness of fit measures indicates that the model with a double risk process outperforms the single risk process NB model, indicating the need for further research to capture all three crash generation processes in the SPFs.
Abstract:
English is currently ascendant as the language of globalisation, evident in its mediation of interactions and transactions worldwide. For many international students, completion of a degree in English means significant credentialing and increased job prospects. Australian universities are the third largest English-speaking destination for overseas students, behind the United States and the United Kingdom. International students comprise one-fifth of the total Australian university population, with 80% coming from Asian countries (ABS, 2010). In this competitive higher education market, English has been identified as a valued good. Indeed, universities have been critiqued for relentlessly reproducing "the hegemony and homogeneity of English" (Marginson, 2006, p. 37) in order to sustain their advantage in the education market. For international students, English is the gatekeeper to enrolment, the medium of instruction and the mediator of academic success. For these reasons, English is not benign, yet it remains largely taken for granted in the mainstream university context. This paper problematises the naturalness of English and reports on a study of an Australian Master of Education course in which English was a focus. The study investigated representations of English as they were articulated across a chain of texts including the university strategic plan, course assessment criteria, student assignments, lecturer feedback, and interviews. Critical Discourse Analysis (CDA) and Foucault's work on discourse enabled understandings of how a particular English is formed through an apparatus of specifications, exclusionary thresholds, strategies for maintenance (and disruption), and privileged concepts and speaking positions. The findings indicate that English has hegemonic status within the Australian university, with material consequences for students whose proficiency falls outside the thresholds of accepted English practice.
Central to the constitution of what counts as English is the relationship of equivalence between standard written English and successful academic writing. International students' representations of English indicate a discourse that impacts on identities and practices and preoccupies them considerably as they negotiate language and task demands. For the lecturer, there is strategic manoeuvring within the institutional regulative regime to support students' English language needs using adapted assessment practices, explicit teaching of academic genres and scaffolded classroom interaction. The paper concludes with the implications for university teaching and learning.
Abstract:
This paper addresses the following predictive business process monitoring problem: given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated with completed cases, for example a label indicating that a given case completed on time (with respect to a given desired duration) or late, or a label indicating that a given case led to a customer complaint or not. The paper tackles this problem via a two-phased approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (uncompleted) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the center of the clusters. We consider two families of clustering algorithms (hierarchical clustering and k-medoids) and use random forests for classification. The approach was evaluated on four real-life datasets.
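A toy sketch of the two-phase approach, under heavy simplifying assumptions: prefixes are encoded as plain numeric vectors rather than complex symbolic sequences, the k-medoids clustering is reduced to a one-pass seeding for k = 2, and a majority-vote label stands in for the per-cluster random forest. All names and data are illustrative.

```python
import math
from collections import Counter

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def two_medoids(points):
    """Toy stand-in for k-medoids with k = 2: seed with the first point
    and the point farthest from it, then assign every point to its
    nearest medoid (no swap refinement)."""
    far = max(range(len(points)), key=lambda i: euclid(points[0], points[i]))
    medoids = [points[0], points[far]]
    clusters = [[], []]
    for i, p in enumerate(points):
        near = 0 if euclid(p, medoids[0]) <= euclid(p, medoids[1]) else 1
        clusters[near].append(i)
    return medoids, clusters

def train(prefixes, outcomes):
    """Phase 1: cluster the encoded prefixes. Phase 2: fit one classifier
    per cluster -- here a majority-vote label standing in for a random
    forest."""
    medoids, clusters = two_medoids(prefixes)
    models = [Counter(outcomes[i] for i in c).most_common(1)[0][0]
              for c in clusters]
    return medoids, models

def predict(medoids, models, trace):
    """Select the cluster closest to the running trace, then apply its
    classifier."""
    near = 0 if euclid(trace, medoids[0]) <= euclid(trace, medoids[1]) else 1
    return models[near]

# hypothetical encoded prefixes (e.g. event counts per activity type)
prefixes = [[3, 0], [4, 1], [0, 5], [1, 4]]
outcomes = ["on_time", "on_time", "late", "late"]
medoids, models = train(prefixes, outcomes)
print(predict(medoids, models, [0, 4]))  # late
```

The per-cluster split is the essential idea: cases with similar prefixes are assumed to share an outcome distribution, so a specialised classifier per cluster can outperform one global model.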