250 results for tense and aspect


Relevance:

30.00%

Publisher:

Abstract:

Two decades after its inception, Latent Semantic Analysis (LSA) has become part and parcel of every modern introduction to Information Retrieval. For any tool that matures so quickly, it is important to check its lore and limitations, or else stagnation will set in. We focus here on the three main aspects of LSA that are well accepted, the gist of which can be summarized as follows: (1) that LSA recovers latent semantic factors underlying the document space, (2) that this can be accomplished through lossy compression of the document space by eliminating lexical noise, and (3) that the latter can best be achieved by Singular Value Decomposition. For each aspect we performed experiments analogous to those reported in the LSA literature and compared the evidence brought to bear in each case. On the negative side, we show that the above claims about LSA are much more limited than commonly believed. Even a simple example suffices to show that LSA does not recover the optimal semantic factors as intended in the pedagogical example used in many LSA publications. Additionally, and remarkably deviating from LSA lore, LSA does not scale up well: the larger the document space, the more unlikely it is that LSA recovers an optimal set of semantic factors. On the positive side, we describe new algorithms to replace LSA (and more recent alternatives such as pLSA, LDA, and kernel methods) by trading its l2 space for an l1 space, thereby guaranteeing an optimal set of semantic factors. These algorithms seem to salvage the spirit of LSA as we think it was initially conceived.
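For context, the classical LSA pipeline referred to in point (3) reduces to a rank-k truncated SVD of the term-document matrix. The sketch below is a minimal illustration of that standard construction on an invented toy matrix; the vocabulary, counts and rank k are illustrative assumptions and are not taken from the paper.

```python
# Minimal LSA sketch: rank-k truncated SVD of a toy term-document matrix.
# The matrix, vocabulary and k are illustrative only.
import numpy as np

# Rows = terms, columns = documents (raw counts).
A = np.array([
    [2, 0, 1, 0],   # "network"
    [1, 0, 2, 0],   # "graph"
    [0, 3, 0, 1],   # "convection"
    [0, 1, 0, 2],   # "heat"
], dtype=float)

k = 2                                          # number of latent "semantic factors"
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]    # lossy rank-k reconstruction

# Document coordinates in the k-dimensional latent space.
doc_vectors = np.diag(s[:k]) @ Vt[:k, :]
print("rank-k approximation:\n", A_k.round(2))
print("document coordinates in latent space:\n", doc_vectors.round(2))
```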

Relevance:

30.00%

Publisher:

Abstract:

Unsteady natural convection inside a triangular cavity is studied. The cavity is filled with a saturated porous medium and has a non-isothermal left inclined wall, while the bottom surface is isothermally heated and the right inclined surface is isothermally cooled. An internal heat generation that depends on the fluid temperature is also considered. The governing equations are solved numerically by the finite element method. The Prandtl number of the fluid is taken as 0.7 (air), while the aspect ratio and the Rayleigh number are taken as 0.5 and 10^5, respectively. The effects of the porosity of the medium and of heat generation on the fluid flow and heat transfer are presented in the form of streamlines and isotherms. The rate of heat transfer through the three surfaces of the enclosure is also presented.

Relevance:

30.00%

Publisher:

Abstract:

This edited book brings together empirical studies of young people in paid employment from a variety of disciplinary perspectives and in different national settings. In the context of increasing youth labour market participation rates and debates about the value of early employment, it draws on multi-level analyses to reflect the complexity of the field. Each of the three sections of the book explores a key aspect of young people's employment: their experience of work, intersections between work and education, and the impact of other actors and institutions. The book contributes to broadening and strengthening knowledge about the opportunities and constraints that young people face during their formative experiences in the labour market.

Relevance:

30.00%

Publisher:

Abstract:

Complex networks have been studied extensively due to their relevance to many real-world systems such as the world-wide web, the internet, and biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are the scale-free degree distribution, the small-world effect and self-similarity. The search for additional meaningful properties and the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity.

The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field including more complex weighted networks. The real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use protein-protein interaction (PPI) networks as a key example to show that their weighted networks inherit the self-similarity of the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. Then we adopt the iterative scoring method to generate weighted PPI networks of five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana. By using the random sequential box-covering algorithm, we calculate the fractal dimensions for both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This finding will be useful for our treatment of the networks in the third part of the thesis.

The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since these fractals consist of a geometrical figure which repeats on an ever-reduced scale, and fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary on different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterize the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalized fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and of a class of real networks, namely the PPI networks of different species.
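As a concrete illustration of the box-covering idea used throughout, here is a minimal sketch of one common variant of random sequential box covering, which estimates the box dimension from the scaling N_B(r_B) ~ r_B^(-d_B). The seeding rule, the radii and the test network below are illustrative assumptions and may differ from the exact algorithm developed in the thesis.

```python
# Sketch of a random sequential box-covering estimate of a network's fractal
# dimension.  Details (seeding, box definition, radii) follow one common
# variant and may differ from the thesis's algorithm.
import random
import networkx as nx
import numpy as np

def box_count(G, r_B, seed=None):
    """Number of boxes of radius r_B needed to cover G (random sequential)."""
    rng = random.Random(seed)
    uncovered = set(G.nodes())
    n_boxes = 0
    while uncovered:
        center = rng.choice(list(uncovered))
        # A box = still-uncovered nodes within distance r_B of the centre.
        ball = nx.single_source_shortest_path_length(G, center, cutoff=r_B)
        uncovered -= set(ball)
        n_boxes += 1
    return n_boxes

def fractal_dimension(G, radii=(1, 2, 3), repeats=10):
    """Minus the slope of log N_B versus log r_B, averaged over covering orders."""
    counts = [np.mean([box_count(G, r, seed=i) for i in range(repeats)])
              for r in radii]
    slope, _ = np.polyfit(np.log(radii), np.log(counts), 1)
    return -slope

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(500, 3, seed=1)   # toy scale-free network
    print("estimated d_B:", round(fractal_dimension(G), 2))
```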
Our main finding is the existence of multifractality in scale-free networks and PPI networks, while multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks. This multifractal analysis then provides a potentially useful tool for gene clustering and identification.

The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterizing complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent work indicates that complex network theory can be a powerful tool to analyse time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a certain length taken from the time series, and the weight of the edge between any two nodes as the Euclidean distance between the corresponding vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, hence a larger Hurst exponent, tend to have a smaller fractal dimension, hence smoother sample paths. We then construct networks via the horizontal visibility graph (HVG) technique, which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalized fractal dimensions of the HVG networks of fractional Brownian motions as well as those of binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest. As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., it follows a power law), meaning that one needs to destroy a large percentage of nodes before the network collapses into isolated parts; for HVG networks of fractional Brownian motions, by contrast, the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.
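The HVG construction mentioned above has a simple definition: each time point is a node, and two points are linked if every value strictly between them lies below both. The sketch below implements that standard definition with a naive O(n^2) scan; the test series is ordinary Brownian motion (Hurst exponent 0.5) rather than a general fractional Brownian motion, purely to keep the example self-contained.

```python
# Sketch of a horizontal visibility graph (HVG): nodes are time points, and
# (i, j) are linked iff every intermediate value is strictly below both
# x_i and x_j.  The toy series is ordinary Brownian motion (H = 0.5).
import numpy as np
import networkx as nx

def horizontal_visibility_graph(x):
    n = len(x)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n - 1):
        G.add_edge(i, i + 1)               # neighbours always see each other
        top = x[i + 1]                     # running max of intermediate values
        for j in range(i + 2, n):
            if top < min(x[i], x[j]):      # all values between i and j are lower
                G.add_edge(i, j)
            top = max(top, x[j])
    return G

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=1000))   # Brownian motion sample path
hvg = horizontal_visibility_graph(series)
degrees = np.array([d for _, d in hvg.degree()])
print("mean degree:", round(degrees.mean(), 2))   # approaches 4 for long aperiodic series
```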

Relevance:

30.00%

Publisher:

Abstract:

The LiteSteel Beam (LSB) is a new cold-formed steel hollow flange channel beam recently developed in Australia. It is commonly used as a floor joist or bearer in buildings. Current practice in flooring systems is to include openings in the web element of floor joists or bearers so that building services can be located within them. The shear behaviour of LSBs with web openings is more complicated, and their shear strength is considerably reduced by the presence of the openings. However, no research has been undertaken on the shear behaviour and strength of LSBs with web openings. Therefore a detailed experimental study involving 26 shear tests was undertaken on simply supported LSB test specimens with web openings and an aspect ratio of 1.5. This paper presents the details of this experimental study, the resulting shear capacities and the observed behavioural characteristics. Experimental results showed that the current design rules in cold-formed steel structures design codes are very conservative for the shear design of LSBs with web openings. Improved design equations have been proposed for the shear strength of LSBs with web openings based on the experimental results from this study.

Relevance:

30.00%

Publisher:

Abstract:

This study examines nascent entrepreneurship by comparing individuals engaged in nascent activities (n=380) with a control group (n=608), after screening a sample from the general population (n=30,427). The study then follows the developmental process of nascent entrepreneurs for 18 months. Bridging and bonding social capital, consisting of both strong and weak ties, was a robust predictor of nascent entrepreneurship, as well as of advancing through the start-up process. With regard to outcomes like first sale or showing a profit, only one aspect of social capital, viz. being a member of a business network, had a statistically significant positive effect. The study supports human capital as a predictor of entry into nascent entrepreneurship, but only weakly as a predictor of carrying the start-up process through to successful completion.

Relevance:

30.00%

Publisher:

Abstract:

This PhD represents my attempt to make sense of my personal experiences of depression through the form of cabaret. I first experienced depression in 2006. Previously, I had considered myself to be a happy and optimistic person. I found the experience of depression to be a shock: both in the experience itself and in the way it affected my own self-image. These personal experiences, together with my professional history as a songwriter and cabaret performer, have been the motivating force behind the research project. This study has explored the question: What are the implications of applying principles of Michael White’s narrative therapy to the creation of a cabaret performance about depression and bipolar disorder? There is a 50 percent weighting on the creative work, the cabaret performance Mind Games, and a 50 percent weighting on the written exegesis. This research has focussed on the illustration of therapeutic principles in order to play games of truth within a cabaret performance. The research project investigates ways of telling my own story in relation to others’ stories through three re-authoring principles articulated in Michael White’s narrative therapy: externalisation, an autonomous ethic of living and rich descriptions. The personal stories presented in the cabaret were drawn from my own experiences and from interviews with individuals with depression or bipolar disorder. The cabaret focussed on the illustration of therapeutic principles, and was not directed at therapeutic ends for myself or the interviewees. The research question has been approached through a methodology combining autoethnographic, practice-led and action research. Autoethnographic research is characterised by close investigation of assumptions, attitudes, and beliefs. The combination of autoethnographic, practice-led and action research has allowed me to bring together personal experiences of mental illness, research into therapeutic techniques, social attitudes and public discourses about mental illness, and forms of contemporary cabaret to facilitate the creation of a one-woman cabaret performance. The exegesis begins with a discussion of games of truth as informed by Michel Foucault and Michael White, and of self-stigma as informed by Michael White and Erving Goffman. These concepts form the basis for a discussion of my own personal experiences. White’s narrative therapy is focused on individuals re-authoring their stories, or telling their stories in different ways. White’s principles are influenced by Foucault’s notions of truth and power. Foucault’s term games of truth has been used to describe the effect of a ‘truth in flux’ that occurs through White’s re-authoring process. This study argues that cabaret is an appropriate form to represent this therapeutic process because it favours heightened performativity over realism, and showcases its ‘constructedness’ and artificiality. Thus cabaret is well suited to playing games of truth. A contextual review compares two major cabaret trends, personal cabaret and provocative cabaret, with reference to the performer’s relationship with the audience in terms of distance and intimacy. The study draws a parallel between principles of distance and intimacy in Michael White’s narrative therapy and relates these to performative terms of distance and intimacy.
The creative component of this study, the cabaret Mind Games, used principles of narrative therapy to present the character ‘Jo’ playing games of truth through: externalising an aspect of her personality (externalisation); exploring different life values (an autonomous ethic of living); and enacting multiple versions of her identity (rich descriptions). This constant shifting between distance and intimacy within the cabaret created the effect of a truth in ‘constant flux’, to use one of White’s terms. There are three inter-related findings in the study. The first finding is that applying principles of White’s narrative therapy successfully combined provocative and empathetic elements within the cabaret. The second finding is that the personal agenda of addressing my own self-stigma within the project limited the effective portrayal of a ‘truth in flux’ within the cabaret. The third finding presents the view that the cabaret expressed ‘Jo’ playing games of truth in order to journey towards her own "preferred identity claim" (White 2004b) through an act of "self care" (Foucault 2005). The contribution to knowledge of this research project is the application of therapeutic principles to the creation of a cabaret performance. This process has focussed on creating a self-revelatory cabaret that questions notions of a ‘fixed truth’ by combining elements of existing cabaret forms in new ways. Two major forms in contemporary cabaret, the personal cabaret and the provocative cabaret, use the performer-audience relationship in distinctive ways. Through combining elements of these two cabaret forms, I have explored ways to create a provocative cabaret focussed on the act of self-revelation.

Relevance:

30.00%

Publisher:

Abstract:

Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory; the other compares a joint (true theoretic) probability distribution with a distribution derived under a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
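The second method can be illustrated with a toy example: observed joint counts for the interpretations of the two concepts are compared against the counts expected if the combination were separable, i.e. if the joint distribution factorised into its marginals. The counts below are invented purely for illustration and do not come from the study.

```python
# Toy illustration of a chi-square goodness-of-fit comparison of an observed
# joint distribution against the distribution implied by a separability
# (independence) assumption.  The counts are invented.
import numpy as np
from scipy.stats import chi2

# Rows: interpretation of concept A (sense 1 / sense 2)
# Cols: interpretation of concept B (sense 1 / sense 2)
observed = np.array([[40.0, 10.0],
                     [15.0, 35.0]])

n = observed.sum()
row_p = observed.sum(axis=1) / n          # marginal distribution of concept A
col_p = observed.sum(axis=0) / n          # marginal distribution of concept B
expected = np.outer(row_p, col_p) * n     # counts expected under separability

chi_sq = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
p_value = chi2.sf(chi_sq, dof)

print(f"chi-square = {chi_sq:.2f}, p = {p_value:.4g}")
# A small p-value means the joint data are poorly explained by the
# separability assumption, i.e. the combination behaves non-separably.
```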

Relevance:

30.00%

Publisher:

Abstract:

Each year the Australian Federal Treasury releases its Tax Expenditures Statement, providing details of the concessions, benefits, and incentives delivered through the tax regime to Australian taxpayers. The current Tax Expenditures Statement, released on 25 January 2008, lists approximately 300 tax expenditures and reports their estimated pecuniary value in terms of revenue foregone, a total of $50.12 billion for the 2006-07 financial year. Apart from the annual Tax Expenditures Statement, and despite the recurring fiscal impact, there is very little other scrutiny of Australia’s Federal tax expenditures program. This is despite tax expenditures often being seen as an alternative to direct expenditures with a similar impact on the Federal budget. The object of tax expenditures is to provide government assistance and meet government objectives; as such, tax expenditures are departures from the revenue-raising aspect of the tax regime. Within this context, this article examines the fundamental concept of tax expenditures as contrasted with direct expenditures and considers the role they play in the current tax regime.

Relevance:

30.00%

Publisher:

Abstract:

Unsteady natural convection inside a triangular cavity is studied. The cavity is filled with a saturated porous medium and has a non-isothermal left inclined wall, while the bottom surface is isothermally heated and the right inclined surface is isothermally cooled. An internal heat generation that depends on the fluid temperature is also considered. The governing equations are solved numerically by the finite volume method. The Prandtl number, Pr, of the fluid is taken as 0.7 (air), while the aspect ratio and the Rayleigh number, Ra, are taken as 0.5 and 10^5, respectively. The effect of heat generation on the fluid flow and heat transfer is presented in the form of streamlines and isotherms. The rate of heat transfer through the three surfaces of the enclosure is also presented.

Relevance:

30.00%

Publisher:

Abstract:

Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice-based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment are used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as it is in lattice-based models, but is instead an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
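The crowding mechanism described here is straightforward to sketch: at each time step every cell attempts to divide with some probability, placing a daughter one cell diameter away in a uniformly random direction, and the attempt is aborted if the daughter would overlap an existing cell. The rules and parameter values below are illustrative assumptions only, not the model or the estimates reported in the paper.

```python
# Minimal sketch of a lattice-free proliferation model with cell-to-cell
# crowding: a division attempt is aborted if the daughter cell would overlap
# an existing cell.  Rules and parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

L = 20.0          # side length of the initial seeding region
sigma = 1.0       # cell diameter (minimum centre-to-centre separation)
p_div = 0.1       # division probability per cell per time step
steps = 100

cells = rng.uniform(0, L, size=(10, 2))     # initial cell centres

def crowded(pos, cells):
    """True if pos lies within one cell diameter of any existing cell.

    A small tolerance keeps the parent, exactly sigma away, from blocking."""
    return np.any(np.linalg.norm(cells - pos, axis=1) < sigma - 1e-9)

for _ in range(steps):
    new_cells = []
    for c in cells:
        if rng.random() < p_div:
            theta = rng.uniform(0, 2 * np.pi)
            daughter = c + sigma * np.array([np.cos(theta), np.sin(theta)])
            if not crowded(daughter, cells):
                new_cells.append(daughter)   # otherwise the attempt is aborted
    if new_cells:
        cells = np.vstack([cells, new_cells])

print("final cell count (crowding-limited):", len(cells))
```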

Relevance:

30.00%

Publisher:

Abstract:

The National Road Safety Strategy 2011-2020 outlines plans to reduce the burden of road trauma via improvements and interventions relating to safe roads, safe speeds, safe vehicles, and safe people. It also highlights that a key aspect in achieving these goals is the availability of comprehensive data on the issue. Such data are essential for conducting more in-depth epidemiologic studies of risk and for effectively evaluating road safety interventions and programs. Before data are used to evaluate the efficacy of prevention programs, a systematic evaluation of the quality of the underlying data sources should be undertaken to ensure that any identified trends reflect true estimates rather than spurious data effects. However, there has been little scientific work specifically focused on establishing core data quality characteristics pertinent to the road safety field, and limited work undertaken to develop methods for evaluating data sources according to these core characteristics. Traffic-related incidents and resulting injuries are recorded in a variety of data sources, each collected for its own defined purpose; these include police reports, transport safety databases, emergency department data, hospital morbidity data and mortality data, to name a few. However, as these data are collected for specific purposes, each source suffers from some limitations when seeking to gain a complete picture of the problem. Limitations of current data sources include delays in data being available, a lack of accurate and/or specific location information, and underreporting of crashes involving particular road user groups such as cyclists. This paper proposes core data quality characteristics that could be used to systematically assess road crash data sources, providing a standardised approach for evaluating data quality in the road safety field. The potential for data linkage to qualitatively and quantitatively improve the quality and comprehensiveness of road crash data is also discussed.

Relevance:

30.00%

Publisher:

Abstract:

Background: Patients with chest pain contribute substantially to emergency department attendances, lengthy hospital stay, and inpatient admissions. A reliable, reproducible, and fast process to identify patients presenting with chest pain who have a low short-term risk of a major adverse cardiac event is needed to facilitate early discharge. We aimed to prospectively validate the safety of a predefined 2-h accelerated diagnostic protocol (ADP) to assess patients presenting to the emergency department with chest pain symptoms suggestive of acute coronary syndrome. Methods: This observational study was undertaken in 14 emergency departments in nine countries in the Asia-Pacific region, in patients aged 18 years and older with at least 5 min of chest pain. The ADP included use of a structured pre-test probability scoring method (Thrombolysis in Myocardial Infarction [TIMI] score), electrocardiograph, and point-of-care biomarker panel of troponin, creatine kinase MB, and myoglobin. The primary endpoint was major adverse cardiac events within 30 days after initial presentation (including initial hospital attendance). This trial is registered with the Australia-New Zealand Clinical Trials Registry, number ACTRN12609000283279. Findings: 3582 consecutive patients were recruited and completed 30-day follow-up. 421 (11.8%) patients had a major adverse cardiac event. The ADP classified 352 (9.8%) patients as low risk and potentially suitable for early discharge. A major adverse cardiac event occurred in three (0.9%) of these patients, giving the ADP a sensitivity of 99.3% (95% CI 97.9–99.8), a negative predictive value of 99.1% (97.3–99.8), and a specificity of 11.0% (10.0–12.2). Interpretation: This novel ADP identifies patients at very low risk of a short-term major adverse cardiac event who might be suitable for early discharge. Such an approach could be used to decrease the overall observation periods and admissions for chest pain. The components needed for the implementation of this strategy are widely available. The ADP has the potential to affect health-service delivery worldwide.
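The reported accuracy figures follow directly from the two-by-two classification implied by the counts in the abstract; the short check below reconstructs sensitivity, negative predictive value and specificity from those counts.

```python
# Reconstructing the reported accuracy figures from the counts in the abstract:
# 3582 patients, 421 with a 30-day MACE, 352 classified low risk by the ADP,
# 3 of whom had a MACE.
total, events, low_risk, missed = 3582, 421, 352, 3

tp = events - missed            # not-low-risk patients who had an event  = 418
fn = missed                     # low-risk patients who had an event      = 3
tn = low_risk - missed          # low-risk patients with no event         = 349
fp = (total - events) - tn      # not-low-risk patients with no event     = 2812

sensitivity = tp / (tp + fn)    # 418/421  -> 99.3%
npv         = tn / (tn + fn)    # 349/352  -> 99.1%
specificity = tn / (tn + fp)    # 349/3161 -> 11.0%

print(f"sensitivity {sensitivity:.1%}, NPV {npv:.1%}, specificity {specificity:.1%}")
```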

Relevance:

30.00%

Publisher:

Abstract:

There is no doubt that information technology (IT) resources are important for organisations in any jurisdiction to manage their processes. Organisations commit considerable financial resources to acquiring and managing their IT resources under various IT governance structures. Investment in IT is thus a strategic necessity. IT resources, however, do not contribute fully to business value on their own. Business value considers the performance impacts of resources at various organisational levels (e.g., the process and firm levels). IT is a biased resource in that it requires some form of manipulation to attain its maximum value. While we know that IT resources are important, a deeper understanding of two aspects of their use in organisations is needed. The first is how to leverage IT resources to attain their maximum value, and the second is where to evaluate IT-related business value in the organisation’s value chain. This understanding is important for organisations to sustain their operations in an ever-changing business environment. We address these issues in two parts; this paper discusses the first aspect, the ways in which organisations can create and sustain their IT-related business value.

Relevance:

30.00%

Publisher:

Abstract:

A deeper understanding of two aspects of the use of IT resources in organisations is important to ensure sustainable investment in these resources. The first is how to leverage IT resources to attain their maximum value. We discussed this aspect in part 1 of this series, where we suggested a complementary approach as a first stage of IT business value creation and a dynamic capabilities approach to secure sustainable IT-related business value from IT resources. The second important aspect of IT business value is where to evaluate IT-related business value in the organisation’s value chain. This understanding is important for organisations to ensure appropriate accountability for the investment in and management of IT resources. We address this issue in this second part of the two-part series.