83 results for two-mass model


Relevance:

80.00%

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by user profiles. Learning to use user profiles effectively is one of the most challenging tasks in developing an IF system. With the document selection criteria better defined on the basis of users’ needs, filtering large streams of information can be more efficient and effective. Term-based approaches have been widely used in the IF community to learn user profiles because of their simplicity and directness, and they are relatively well established. However, these approaches have problems dealing with polysemy and synonymy, which often lead to information overload. Recently, pattern-based approaches (or Pattern Taxonomy Models, PTM [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must also deal with low-frequency pattern issues. The measures used by data mining techniques to learn the profile (for example, “support” and “confidence”) have turned out to be unsuitable for filtering; they can lead to a mismatch problem. This thesis uses rough set-based (term-based) reasoning and pattern mining as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: topic filtering and pattern mining. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the information mismatch problem and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms other IF systems, such as the traditional Rocchio IF model and state-of-the-art term-based models including BM25, Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
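The two-stage architecture can be pictured as a threshold filter followed by a precision-oriented re-ranker. A schematic sketch only: the stand-in scoring functions below are placeholders for the thesis's rough-set threshold model and pattern-taxonomy ranking, which are not reproduced here.

    def two_stage_filter(docs, topic_score, pattern_score, threshold):
        """Stage 1 (topic filtering): drop documents scoring below a learned threshold.
        Stage 2 (pattern mining): re-rank survivors with a precision-oriented score."""
        survivors = [d for d in docs if topic_score(d) >= threshold]
        return sorted(survivors, key=pattern_score, reverse=True)

    # Toy usage; the lambda scorers are placeholders, not the thesis's functions
    docs = ["stock markets fell", "football results", "market regulation news"]
    print(two_stage_filter(
        docs,
        topic_score=lambda d: d.count("market"),
        pattern_score=lambda d: len(d.split()),
        threshold=1,
    ))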

Relevance:

80.00%

Abstract:

People have adopted various media formats, such as graphics, photos and text (nicknames), to represent themselves when communicating with others online. An avatar is a visual form representing the user and the identity they wish to present. Its form can vary from a two-dimensional to a three-dimensional model and can be visualised in various visual forms and styles. In general, two-dimensional images, including animated images, are used in online forum communities and live chat software, while three-dimensional models are often used in computer games. Avatar design is often regarded as a graphic designer's visual image creation or a user's output based on personal preference, which often results in avatar designs that neglect practical visual design and users' interactive communication experience. This paper reviews various types and styles of avatar and discusses avatar design from visual design and online user experience perspectives. It aims to raise a design discourse on avatar design and build up a well-articulated set of design principles for effective avatar design.

Relevance:

80.00%

Abstract:

The melting of spherical nanoparticles is considered from the perspective of heat flow in a pure material and as a moving boundary (Stefan) problem. The dependence of the melting temperature on both the size of the particle and the interfacial tension is described by the Gibbs-Thomson effect, and the resulting two-phase model is solved numerically using a front-fixing method. Results show that interfacial tension increases the speed of the melting process, and furthermore, the temperature distribution within the solid core of the particle exhibits behaviour that is qualitatively different to that predicted by the classical models without interfacial tension.
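The size dependence invoked here is the Gibbs-Thomson relation; a standard statement for a spherical particle (the notation below is ours, chosen for illustration, not taken from the paper):

    T_m(R) = T_m^{\infty}\left(1 - \frac{2\sigma_{sl}}{\rho_s L\, R}\right)

where T_m^{\infty} is the bulk melting temperature, \sigma_{sl} the solid-liquid interfacial tension, \rho_s the solid density, L the latent heat of fusion, and R the particle radius. The melting temperature falls as R shrinks, which is what couples particle size and interfacial tension to the speed of the moving boundary.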

Relevance:

80.00%

Abstract:

Lamb wave propagation in composite materials has been studied extensively since it was first observed in 1982. In this paper, we present a procedure for simulating the propagation of Lamb waves in composite laminates using a two-dimensional model in ANSYS. This is done by simulating the Lamb waves propagating along the plane of the structure in the form of a time-dependent force excitation. An 8-layered carbon fibre reinforced plastic (CFRP) laminate is modelled as a transversely isotropic, dissipative medium, and the effect of flaws is analyzed with respect to defects induced between various layers of the composite laminate. This effort is the basis for the future development of a 3D model for similar applications.
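The abstract does not specify the excitation waveform; a Hann-windowed tone burst is a common choice for time-dependent force excitation in Lamb wave simulations. A minimal sketch, with all parameter values illustrative:

    import numpy as np

    def tone_burst(f_c=100e3, n_cycles=5, fs=10e6):
        """Hann-windowed sinusoidal tone burst (centre frequency f_c, sampling rate fs)."""
        duration = n_cycles / f_c                 # burst length in seconds
        t = np.arange(0.0, duration, 1.0 / fs)
        hann = 0.5 * (1.0 - np.cos(2.0 * np.pi * f_c * t / n_cycles))
        return t, hann * np.sin(2.0 * np.pi * f_c * t)

    t, force = tone_burst()
    # 'force' would be applied as the time-dependent nodal load in the FE model.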

Relevance:

80.00%

Abstract:

This paper contributes to the literature on subjective well-being (SWB) by taking into account different aspects of life, called domains, such as health, financial situation, job, leisure, housing, and environment. We postulate a two-layer model where individual total SWB depends on the different subjective domain satisfactions. A distinction is made between long-term and short-term effects. The individual domain satisfactions depend on objectively measurable variables, such as income. The model is estimated using a large German panel data set.
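Schematically, the two-layer structure can be written as follows (the notation is ours; the paper's exact specification, including the long-term/short-term split, is not reproduced):

    GS_i = \alpha + \sum_{k=1}^{K} \beta_k \, DS_{ik} + \varepsilon_i,
    \qquad
    DS_{ik} = \gamma_k^{\top} x_i + u_{ik}

where GS_i is individual i's total subjective well-being, DS_{ik} the satisfaction with domain k (health, finances, job, leisure, housing, environment), and x_i objectively measurable variables such as income.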

Relevance:

80.00%

Abstract:

Thermal analysis of euchroite shows two mass loss steps, in the temperature ranges 100 to 105 °C and 185 to 205 °C, attributed to dehydration and dehydroxylation of the mineral. Hot stage Raman spectroscopy (HSRS), which involves the collection of Raman spectra as a function of temperature, has been used to study the thermal stability of euchroite, a mineral involved in a complex set of equilibria between the copper hydroxy arsenates: euchroite Cu₂(AsO₄)(OH)·3H₂O → olivenite Cu₂(AsO₄)(OH) → strashimirite Cu₈(AsO₄)₄(OH)₄·5H₂O → arhbarite Cu₂Mg(AsO₄)(OH)₃. HSRS shows that euchroite decomposes between 125 and 175 °C with the loss of water. At 125 °C, Raman bands are observed at 858 cm⁻¹, assigned to the ν₁ (A₁) AsO₄³⁻ symmetric stretching vibration, and at 801, 822 and 871 cm⁻¹, assigned to the ν₃ AsO₄³⁻ antisymmetric stretching vibration. A distinct band shift is observed upon heating to 275 °C, at which temperature four Raman bands are resolved at 762, 810, 837 and 862 cm⁻¹. Further heating diminishes the intensity of the Raman spectra, which is attributed to sublimation of the arsenate mineral. Hot stage Raman spectroscopy is a most useful technique for studying the thermal stability of minerals, especially when only very small amounts of material are available.
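As a plausibility check on the dehydration step (our own back-of-envelope arithmetic, not taken from the paper), the theoretical mass loss for euchroite releasing its three waters of crystallisation follows directly from the formula masses:

    # Theoretical mass loss for euchroite Cu2(AsO4)(OH)·3H2O -> olivenite Cu2(AsO4)(OH);
    # atomic masses in g/mol
    Cu, As, O, H = 63.55, 74.92, 16.00, 1.008

    water = 3 * (2 * H + O)                            # three waters of crystallisation
    euchroite = 2 * Cu + As + 4 * O + (O + H) + water  # full formula mass

    print(f"formula mass: {euchroite:.1f} g/mol")
    print(f"theoretical dehydration mass loss: {100 * water / euchroite:.1f} %")  # ~16 %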

Relevance:

80.00%

Abstract:

Objective: The Brief Michigan Alcoholism Screening Test (bMAST) is a 10-item test derived from the 25-item Michigan Alcoholism Screening Test (MAST). It is widely used in the assessment of alcohol dependence. In the absence of previous validation studies, the principal aim of this study was to assess the validity and reliability of the bMAST as a measure of the severity of problem drinking. Method: A total of 6,594 patients (4,854 men, 1,740 women) who had been referred to a hospital alcohol and drug service for alcohol-use disorders voluntarily participated in this study. Results: An exploratory factor analysis defined a two-factor solution, consisting of Perception of Current Drinking and Drinking Consequences factors. Structural equation modeling confirmed that the fit of a nine-item, two-factor model was superior to that of the original one-factor model. Concurrent validity was assessed through simultaneous administration of the Alcohol Use Disorders Identification Test (AUDIT) and associations with alcohol consumption and clinically assessed features of alcohol dependence. The two-factor bMAST model showed moderate correlations with the AUDIT. The two-factor bMAST and AUDIT were similarly associated with quantity of alcohol consumption and clinically assessed dependence severity features. No differences were observed between the existing weighted scoring system and the proposed simple scoring system. Conclusions: In this study, both the existing bMAST total score and the two-factor model identified were as effective as the AUDIT in assessing problem drinking severity. There are additional advantages to employing the two-factor bMAST in the assessment and treatment planning of patients seeking treatment for alcohol-use disorders. (J. Stud. Alcohol Drugs 68: 771-779, 2007)
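The study's analyses were presumably run in dedicated SEM software; purely as an illustration of extracting a two-factor solution from item-level responses, with hypothetical data:

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    items = rng.integers(0, 2, size=(500, 9)).astype(float)  # hypothetical 9 item responses

    fa = FactorAnalysis(n_components=2, rotation="varimax")  # two-factor solution
    fa.fit(items)
    print(fa.components_.round(2))  # loadings: rows are factors, columns are items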

Relevance:

80.00%

Abstract:

The effect of conversion from forest to pasture on soil carbon stocks has been intensively discussed, but few studies focus on how this land-use change affects carbon (C) distribution across soil fractions in the Amazon basin. We investigated this in the top 20 cm along a chronosequence of sites from native forest to three successively older pastures. We performed a physicochemical fractionation of bulk soil samples to better understand the mechanisms by which soil C is stabilized and to evaluate the contribution of each C fraction to total soil C. Additionally, we used a two-pool model to estimate the mean residence time (MRT) of the slow and active pool C in each fraction. Soil C increased with conversion from forest to pasture in the particulate organic matter (> 250 µm), microaggregate (53-250 µm), and d-clay (< 2 µm) fractions. The microaggregate fraction contained the highest soil C content after the conversion from forest to pasture. The C content of the d-silt fraction decreased with time since conversion to pasture. Forest-derived C remained in all fractions, with the highest concentration in the finest fractions and the largest proportion of forest-derived soil C associated with clay minerals. Results from this work indicate that microaggregate formation is sensitive to changes in management and might serve as an indicator for management-induced soil carbon changes, and that soil C changes in the fractions depend on soil texture.
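The abstract does not give the functional form; a two-pool model is conventionally written as the sum of an active and a slow exponential pool, with MRT the reciprocal of each decay constant. A sketch with synthetic data (the paper's exact formulation may differ):

    import numpy as np
    from scipy.optimize import curve_fit

    def two_pool(t, c_active, k_active, k_slow, c_total=100.0):
        """Two-pool decay: a fast-cycling active pool plus a persistent slow pool."""
        return c_active * np.exp(-k_active * t) + (c_total - c_active) * np.exp(-k_slow * t)

    t = np.linspace(0.0, 30.0, 16)                                     # years since conversion
    rng = np.random.default_rng(1)
    obs = two_pool(t, 20.0, 0.8, 0.02) + rng.normal(0.0, 0.5, t.size)  # synthetic series

    params, _ = curve_fit(two_pool, t, obs, p0=[10.0, 0.5, 0.01])
    print(f"MRT active ~ {1/params[1]:.1f} yr, MRT slow ~ {1/params[2]:.0f} yr")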

Relevance:

80.00%

Abstract:

The business value of IT has mostly been studied in developed countries, but because most investment in developing countries is derived from external sources, the influence of that investment on business value is likely to be different. We test this notion using a two-layer model: we examine the impact of IT investments on firm processes, and the relationship of these processes to firm performance, in a developing country. Our findings suggest that investment in different areas of IT positively relates to improvements in intermediate business processes, and that these intermediate business processes positively relate to the overall financial performance of firms in a developing country.
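The two-layer test can be pictured as a pair of regressions, IT investment -> intermediate process -> performance. A sketch with hypothetical variable names and simulated data; the paper's actual estimator may differ:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    it_invest = rng.normal(size=200)                  # hypothetical IT investment level
    process = 0.6 * it_invest + rng.normal(size=200)  # intermediate business process measure
    perform = 0.5 * process + rng.normal(size=200)    # overall financial performance

    layer1 = sm.OLS(process, sm.add_constant(it_invest)).fit()  # layer 1: IT -> process
    layer2 = sm.OLS(perform, sm.add_constant(process)).fit()    # layer 2: process -> performance
    print(layer1.params, layer2.params)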

Relevance:

80.00%

Abstract:

This book provides a much-needed international dimension on the payoffs of information technology investments. The bulk of the research on the impact of information technology investments has been undertaken in developed economies, mainly the United States; this research provides an alternative, developing-country perspective on how information technology investments impact organizations. Secondly, there has been much debate and controversy over how information technology investment payoffs are measured. This research uses an innovative two-stage model, proposing that information technology investments first impact processes, and improvements in those processes then impact performance; in doing so, it considers sectors of information technology investment rather than treating investment as a single aggregate. Finally, almost all prior studies in this area have considered only the tangible impact of information technology investments. This research proposes that the benefits can be properly understood only by looking at both the tangible and intangible benefits.

Relevance:

80.00%

Abstract:

Extreme cold and heat waves, characterised by a number of cold or hot days in succession, place a strain on people’s cardiovascular and respiratory systems. The increase in deaths due to these waves may be greater than that predicted by extreme temperatures alone. We examined cold and heat waves in 99 US cities over 14 years (1987–2000) and investigated how the risk of death depended on the temperature threshold used to define a wave, and on a wave’s timing, duration and intensity. We defined cold and heat waves as two or more days below a cold threshold or above a heat threshold, respectively. We tried five cold thresholds, using the first to fifth percentiles of temperature, and five heat thresholds, using the ninety-fifth to ninety-ninth percentiles. The extra wave effects were estimated using a two-stage model, ensuring that they were estimated after removing the general effects of temperature. The increases in deaths associated with cold waves were generally small and not statistically significant, and there was even evidence of a decreased risk during the coldest waves. Heat waves generally increased the risk of death, particularly for the hottest heat threshold. Cold waves of a colder intensity or longer duration were not more dangerous. Cold waves earlier in the cool season were more dangerous, as were heat waves earlier in the warm season. In general, there was no increased risk of death during cold waves above the known increased risk associated with cold temperatures. Cold or heat waves earlier in the cool or warm season may be more dangerous because of a build-up in the susceptible pool or a lack of preparedness for cold or hot temperatures.
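The wave definition translates directly into a run-length rule. A sketch of heat-wave identification under that definition (hypothetical data; the study's processing was certainly more involved):

    import numpy as np

    def find_heat_waves(temps, pct=95, min_days=2):
        """Runs of >= min_days consecutive days at or above the pct-th percentile."""
        threshold = np.percentile(temps, pct)
        waves, run = [], []
        for day, t in enumerate(temps):
            if t >= threshold:
                run.append(day)
            else:
                if len(run) >= min_days:
                    waves.append(run)
                run = []
        if len(run) >= min_days:
            waves.append(run)
        return waves

    temps = np.random.default_rng(3).normal(20.0, 8.0, 365)  # hypothetical daily means
    print(find_heat_waves(temps))                            # day indices of each heat wave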

Relevance:

80.00%

Abstract:

The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a choice of how much of their income to report to the tax authorities, given a certain institutional environment represented by parameters such as the probability of detection and the penalties imposed in the event the agent is caught. While this basic framework yields important insights on tax compliance behavior, it has some critical limitations; specifically, it indicates a level of compliance significantly below what is observed in the data. This thesis revisits the original framework with a view to addressing this issue and examining the political economy implications of tax evasion for progressivity in the tax structure. The approach involves building a macroeconomic, dynamic equilibrium model to examine these issues, using a step-wise model-building procedure that starts with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations incorporates the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation allows agents to initially decide whether to evade taxes or not; in the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report. We find that the ‘evade or not’ assumption has strikingly different and more realistic implications for the extent of evasion, and demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature and involve consumption smoothing across time and across various states of nature. Specifically, since the decision to undertake tax evasion affects the agent’s consumption smoothing ability by creating two states of nature in which the agent is ‘caught’ or ‘not caught’, it is possible that their utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an ‘evade or not’ choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the ‘evade or not’ model vote for a relatively low value of the tax rate. The final steps in the model-building procedure graft the two-period models with a political economy choice onto a dynamic overlapping generations setting with more general, non-linear tax schedules and a ‘cost-of-evasion’ function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the models’ ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the ‘evade or not’ version of the model and its Allingham and Sandmo counterpart are now very striking: there is a large range of values of the inequality parameter for which agents in the ‘evade or not’ model vote for a low degree of progressivity. This is because, in the ‘evade or not’ version of the model, low values of the tax rate encourage a large number of agents to choose the ‘not evade’ option, so that the redistributive mechanism is more ‘efficient’ relative to situations in which tax rates are high. Some further implications of the models of this thesis concern whether variations in the level of inequality, and parameters such as the probability of detection and penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate for a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
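A numerical sketch of the ‘evade or not’ comparison described above, under our own illustrative parameterisation (CRRA utility and a Yitzhaki-style fine proportional to the evaded tax; none of these values come from the thesis): the agent compares the certain utility of full compliance with the maximised expected utility from evading.

    import numpy as np

    def crra(c, rho=2.0):
        """CRRA utility; the functional form is an illustrative assumption."""
        return np.log(c) if rho == 1.0 else c ** (1 - rho) / (1 - rho)

    def evade_or_not(y=1.0, tau=0.3, p=0.05, penalty=1.5, rho=2.0):
        """Allingham-Sandmo style choice: declare x in (0, y]; if audited
        (probability p), pay a fine at rate 'penalty' on the evaded tax."""
        x = np.linspace(1e-3, y, 10_000)              # candidate declared incomes
        eu = ((1 - p) * crra(y - tau * x, rho)
              + p * crra(y - tau * x - penalty * tau * (y - x), rho))
        u_comply = crra(y * (1 - tau), rho)           # certain utility, full compliance
        i = eu.argmax()
        return ("evade", float(x[i])) if eu[i] > u_comply else ("comply", y)

    print(evade_or_not())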

Relevance:

80.00%

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors, grouped by their regulatory role, and corresponding promoter strength. Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. Some of the features discovered proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations as more experimentally confirmed data become available.
Much of the remainder of the thesis concerns a machine learning study of binding site prediction using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in ‘moderately’ conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains, which revealed interesting strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled ‘regulatory trees’, inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees are constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to ‘hardware’, the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to ‘software’. In this context, we explored the ‘pan-regulatory network’ for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the ‘core-regulatory-set’, and interactions found only in a subset of the genomes explored, the ‘sub-regulatory-set’. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
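For reference, the spectrum kernel mentioned above measures sequence similarity through shared k-mer content; a minimal sketch (k and the sequences are illustrative):

    from collections import Counter

    def spectrum(seq, k=3):
        """k-mer occurrence counts of a sequence (its k-spectrum)."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def spectrum_kernel(a, b, k=3):
        """Inner product of the two k-spectra: the spectrum kernel value."""
        sa, sb = spectrum(a, k), spectrum(b, k)
        return sum(n * sb[kmer] for kmer, n in sa.items())

    print(spectrum_kernel("ACGTACGT", "CGTACG"))  # -> 6, from the shared 3-mers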

Relevance:

80.00%

Abstract:

This study compared the performance of one local and three robust optimality criteria, in terms of the standard error, for a one-parameter and a two-parameter nonlinear model with uncertainty in the parameter values. The designs were also compared under misspecification of the prior parameter distribution. The impact of different correlations between parameters on the optimal design was examined for the two-parameter model. The designs and standard errors were solved analytically where possible and numerically otherwise.
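As a concrete illustration of a local optimality criterion (our example model, not necessarily the one studied): for a one-parameter exponential decay η(x, θ) = e^(−θx) with additive unit-variance Gaussian error, the locally D-optimal one-point design maximises the squared parameter sensitivity, i.e. the Fisher information, and the asymptotic standard error is its inverse square root.

    import numpy as np

    theta0 = 0.5                           # assumed best prior guess of theta
    x = np.linspace(0.01, 20.0, 2000)      # candidate design points
    info = (x * np.exp(-theta0 * x)) ** 2  # Fisher information per observation (unit variance)

    x_opt = x[info.argmax()]
    print(f"locally D-optimal point: x* = {x_opt:.2f} (analytically 1/theta0 = {1/theta0:.1f})")
    print(f"asymptotic std. error per observation at x*: {1 / np.sqrt(info.max()):.2f}")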

Relevance:

80.00%

Abstract:

Differing parental considerations for girls and boys in households are a primary cause of the gender gap in school enrolment and educational attainment in developing countries, particularly in Sub-Saharan Africa and South Asia. While a number of studies have focused on the inequality of educational opportunities in South Asia, little is known about Bhutan. This study uses recent household expenditure data from the Bhutan Living Standard Survey to evaluate the gender gap in the allocation of resources for schooling. The findings, based on cross-sectional as well as household fixed-effect approaches, suggest that girls are less likely to enrol in school but are not allocated fewer resources once they are enrolled.
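The household fixed-effects comparison can be sketched as a within transformation, identifying the gender gap from children in the same household (the column names are hypothetical, not from the Bhutan Living Standard Survey):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical child-level rows from an expenditure survey
    df = pd.DataFrame({
        "household": [1, 1, 2, 2, 3, 3],
        "girl":      [1, 0, 1, 0, 1, 0],
        "spend":     [0.0, 900.0, 800.0, 850.0, 0.0, 700.0],  # schooling expenditure
    })

    # Within transformation: demean by household to absorb household fixed effects
    for col in ("girl", "spend"):
        df[col + "_w"] = df[col] - df.groupby("household")[col].transform("mean")

    fe = smf.ols("spend_w ~ girl_w - 1", data=df).fit()
    print(fe.params)  # within-household girl-boy gap in schooling expenditure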