905 results for Web Mining, Data Mining, User Topic Model, Web User Profiles


Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 100.00%

Abstract:

Information processing speed, as measured by elementary cognitive tasks, is correlated with higher order cognitive ability so that increased speed relates to improved cognitive performance. The question of whether the genetic variation in Inspection Time (IT) and Choice Reaction Time (CRT) is associated with IQ through a unitary factor was addressed in this multivariate genetic study of IT, CRT, and IQ subtest scores. The sample included 184 MZ and 206 DZ twin pairs with a mean age of 16.2 years (range 15-18 years). They were administered a visual (pi-figure) IT task, a two-choice RT task, five computerized subtests of the Multidimensional Aptitude Battery, and the digit symbol substitution subtest from the WAIS-R. The data supported a factor model comprising a general, three group (verbal ability, visuospatial ability, broad speediness), and specific genetic factor structure, a shared environmental factor influencing all tests but IT, plus unique environmental factors that were largely specific to individual measures. The general genetic factor displayed factor loadings ranging between 0.35 and 0.66 for the IQ subtests, with IT and CRT loadings of -0.47 and -0.24, respectively. Results indicate that a unitary factor is insufficient to describe the entire relationship between cognitive speed measures and all IQ subtests, with independent genetic effects explaining further covariation between processing speed (especially CRT) and Digit Symbol.
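For readers unfamiliar with the underlying framework, a minimal sketch of the standard multivariate twin (ACE) covariance decomposition assumed by such analyses is given below; this is generic background, not the study's exact parameterisation, which additionally splits A into general, group and test-specific genetic factors.

```latex
% Standard ACE decomposition of the phenotypic covariance across the
% IT, CRT and IQ subtest measures (illustrative background only):
\Sigma_P = A + C + E, \qquad
\mathrm{Cov}_{\mathrm{MZ}} = A + C, \qquad
\mathrm{Cov}_{\mathrm{DZ}} = \tfrac{1}{2}A + C ,
% where A, C and E are the additive genetic, shared environmental and
% unique environmental covariance matrices; the reported factor loadings
% describe how A decomposes into general, group and specific genetic factors.
```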

Relevance: 100.00%

Abstract:

In this paper we propose a new identification method based on the residual white noise autoregressive criterion (Pukkila et al., 1990) to select the order of VARMA structures. Results from extensive simulation experiments based on different model structures with varying numbers of observations and numbers of component series are used to demonstrate the performance of this new procedure. We also use economic and business data to compare the model structures selected by this order selection method with those identified in other published studies.
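The paper's residual white noise autoregressive criterion is not reproduced here; as a rough sketch of the same idea (fit candidate VARMA(p, q) structures and keep the most parsimonious one whose residuals look white), assuming statsmodels is available, one might write:

```python
# Sketch of the selection idea: fit candidate VARMA(p, q) structures and keep
# the most parsimonious one whose residuals pass a whiteness check.  A
# Ljung-Box test stands in here for the paper's RWNAR criterion.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

def select_varma_order(data: pd.DataFrame, max_p=3, max_q=2, alpha=0.05):
    candidates = []
    for p in range(max_p + 1):
        for q in range(max_q + 1):
            if p == 0 and q == 0:
                continue
            try:
                res = VARMAX(data, order=(p, q)).fit(disp=False)
            except Exception:
                continue                     # some orders may fail to converge
            resid = np.asarray(res.resid)
            white = all(
                acorr_ljungbox(resid[:, i], lags=[10])["lb_pvalue"].iloc[0] > alpha
                for i in range(resid.shape[1])
            )
            if white:                        # residuals look like white noise
                candidates.append((p + q, res.aic, (p, q)))
    # most parsimonious whitening structure, ties broken by AIC
    return min(candidates)[2] if candidates else None
```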

Relevance: 100.00%

Abstract:

Background: This study extended that of Kwon and Oei [Kwon, S.M., Oei, T.P.S., 2003. Cognitive change processes in a group cognitive behavior therapy of depression. J. Behav. Ther. Exp. Psychiatry, 3, 73-85], which outlined a number of testable models based on Beck's cognitive theory of depression. Specifically, the current study tested the following four competing models in patients with major depressive disorder: the causal, consequential, fully interactive and partially interactive cognitive models. Methods: A total of 168 clinically depressed outpatients were recruited into a 12-week group cognitive behaviour therapy program. Data were collected at three time points (baseline, mid-treatment and termination of therapy) using the ATQ, DAS and BDI. The data were analysed with Amos 4.01 (Arbuckle, J.L., 1999. Amos 4.1. Smallwaters, Chicago.) structural equation modelling. Results: Dysfunctional attitudes, negative automatic thoughts and symptoms of depression reduced significantly during treatment. The causal and consequential models both provided an adequate fit to the data, and the fully interactive model provided the best fit. However, after removing non-significant pathways, it was found that reduced depressive symptoms contributed to reduced depressogenic automatic thoughts and dysfunctional attitudes, not the reverse. Conclusion: These findings did not fully support Beck's cognitive theory of depression, in which cognitions are primary in the reduction of depressed mood. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

Security protocols are often modelled at a high level of abstraction, potentially overlooking implementation-dependent vulnerabilities. Here we use the Z specification language's rich set of data structures to formally model potentially ambiguous messages that may be exploited in a 'type flaw' attack. We then show how to formally verify whether or not such an attack is actually possible in a particular protocol using Z's schema calculus.
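As an informal illustration of the kind of ambiguity being modelled (not the paper's Z specification, and with all names hypothetical): if protocol fields are concatenated without type tags, a receiver can parse one field as another.

```python
# Toy illustration of a 'type flaw': an untagged message layout lets a
# 16-byte public nonce be accepted where a 16-byte session key is expected.
import os

def make_message(sender_id: bytes, nonce: bytes) -> bytes:
    # Untagged concatenation: nothing records which bytes are which field.
    return sender_id + nonce

def parse_as_key_exchange(msg: bytes) -> dict:
    # A different message type with the same layout: 8-byte id + 16-byte key.
    return {"sender": msg[:8], "session_key": msg[8:24]}

honest = make_message(sender_id=b"ALICE\x00\x00\x00", nonce=os.urandom(16))
confused = parse_as_key_exchange(honest)
# The receiver now treats a public nonce as a secret key -- the essence of a
# type flaw.  Tagging each field with its type (as the Z model's rich data
# structures allow) removes the ambiguity.
print(confused["session_key"].hex())
```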

Relevance: 100.00%

Abstract:

This work presents a case study on the role of the Câmara Técnica de Pesca (CTP) of the Consórcio Intermunicipal Lagos São João (CILSJ) in mediating artisanal fishing management conflicts in the Lagoa de Araruama (LA) region. The CTP is governed by the priorities of the Política Nacional de Recursos Hídricos (PNRH), which is preservationist, as opposed to the Política Nacional de Desenvolvimento Sustentável da Atividade Pesqueira (PNDSAP), which is oriented towards economic exploitation. In the Lagos São João watershed, the CILSJ prioritises actions to conserve and maintain water bodies for supply, relegating to second place the recovery of the estuaries into which treated and untreated sewage is discharged, precisely the places where fishing takes place. The research subjects were the fishers' representatives, namely the presidents of the fishing colonies. Data were collected through interviews, direct observation, participant observation, documents, films, photographs and testimonies, also taking into account the actors involved in public fisheries management at the local, state and federal levels. The interviews were examined using textual analysis, and the research approach is qualitative. The artisanal fishing practised in the locality is small-scale, uses small boats, and is organised in crew-share and/or family arrangements. Co-management is currently the most widely used methodology for managing common resources in artisanal fisheries. The data revealed that the CTP's co-management model is not the most suitable for mediating the locality's fishing conflicts. This study found that the colonies are united through the CTP, but even so the CTP mechanism does not allow the fishers to achieve greater gains, since the co-management model is merely consultative: the public authorities consult but make decisions autonomously, without sharing management power, so the fishers are not empowered. It is therefore urgent to replace the co-management system exercised by the CTP with one that allows greater participation by the fishers themselves and not only their representatives; management autonomy for the fishers; and the possibility of funding not only preservation activities but also the economic development of fishing. Other co-management models that could replace the CTP are the Reserva Extrativista (RESEX), the Reserva de Desenvolvimento Sustentável and the Fishing Forum, given that these are the most successful co-management models in the country, including in part of the region itself (the RESEX of Arraial do Cabo). This research's findings on the role played by the CTP in co-management in the LA are consistent with the deficiencies of fisheries co-management models in Brazil, with limited empowerment of fishers vis-à-vis the public authorities and the subordination of public fisheries management to public environmental management. Although there is some academic work on fishing, the literature on fisheries management in the LA locality is scarce, which hinders the development of fisheries sustainability and the application of any management plan and points to the urgent need for further research, to which this work seeks to make some contribution.

Relevance: 100.00%

Abstract:

A Bayesian procedure for the retrieval of wind vectors over the ocean using satellite borne scatterometers requires realistic prior near-surface wind field models over the oceans. We have implemented carefully chosen vector Gaussian Process models; however in some cases these models are too smooth to reproduce real atmospheric features, such as fronts. At the scale of the scatterometer observations, fronts appear as discontinuities in wind direction. Due to the nature of the retrieval problem a simple discontinuity model is not feasible, and hence we have developed a constrained discontinuity vector Gaussian Process model which ensures realistic fronts. We describe the generative model and show how to compute the data likelihood given the model. We show the results of inference using the model with Markov Chain Monte Carlo methods on both synthetic and real data.
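To make the "data likelihood given the model" step concrete, here is a generic Gaussian process marginal likelihood computation; the paper's model is a constrained vector GP over wind components, so this scalar version only illustrates the calculation, with all parameter values hypothetical.

```python
# Sketch: Gaussian log marginal likelihood of observations y under a GP prior
# with an RBF covariance plus observation noise.
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_log_marginal_likelihood(X, y, lengthscale=1.0, variance=1.0, noise=0.1):
    n = len(y)
    K = rbf_kernel(X, lengthscale, variance) + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)                       # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2 * np.pi))

# toy usage on a 1-D grid standing in for scatterometer cell positions
X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(0).standard_normal(50)
print(gp_log_marginal_likelihood(X, y, lengthscale=2.0))
```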

Relevance: 100.00%

Abstract:

This thesis is developed from a real-life application of performance evaluation of small and medium-sized enterprises (SMEs) in Vietnam. The thesis presents two main methodological developments on the evaluation of dichotomous environment variable impacts on technical efficiency. Taking selection bias into account, the thesis proposes a revised frontier separation approach for the seminal Data Envelopment Analysis (DEA) model developed by Charnes, Cooper, and Rhodes (1981). The revised frontier separation approach is based on nearest-neighbour propensity score matching, pairing treated SMEs with their counterfactuals on the propensity score. The thesis also develops an order-m frontier conditioned on the propensity score, building on the conditional order-m approach proposed by Cazals, Florens, and Simar (2002) and advocated by Daraio and Simar (2005). This development allows the conditional order-m approach to be applied with a dichotomous environment variable while taking into account the self-selection problem of impact evaluation. Monte Carlo-style simulations were built to examine the effectiveness of these developments. The methodological developments of the thesis are applied in empirical studies to evaluate the impact of training programmes on the performance of food processing SMEs and the impact of exporting on the technical efficiency of textile and garment SMEs in Vietnam. The analysis shows that training programmes have no significant impact on the technical efficiency of food processing SMEs. Moreover, the analysis confirms the conclusion of the export literature that exporters are self-selected into the sector. The thesis finds no significant impact of exporting activities on the technical efficiency of textile and garment SMEs; however, a large bias has been eliminated by the proposed approach. The results of the empirical studies contribute to the understanding of the impact of different environmental variables on the performance of SMEs and help policy makers design appropriate policies to support the development of Vietnamese SMEs.
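As an illustration of the matching step only (a generic nearest-neighbour propensity score match, not the thesis's full revised frontier separation procedure; the helper below is hypothetical), assuming scikit-learn is available:

```python
# Sketch: nearest-neighbour matching on the propensity score.
# Treated units (e.g. trained SMEs) are paired with the untreated unit whose
# estimated propensity score is closest.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_on_propensity(X, treated):
    """X: (n, k) covariate matrix; treated: boolean NumPy array of length n."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    controls = np.where(~treated)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
    # pairs of (treated index, matched control index)
    return list(zip(np.where(treated)[0], controls[idx[:, 0]]))
```

Efficiency comparisons (DEA or order-m) would then be run on the matched sample rather than on the raw treated/untreated split.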

Relevance: 100.00%

Abstract:

Edge blur is an important perceptual cue, but how does the visual system encode the degree of blur at edges? Blur could be measured by the width of the luminance gradient profile, the peak-to-trough separation in the 2nd derivative profile, or the ratio of 1st-to-3rd derivative magnitudes. In template models, the system would store a set of templates of different sizes and find which one best fits the 'signature' of the edge. The signature could be the luminance profile itself, or one of its spatial derivatives. I tested these possibilities in blur-matching experiments. In a 2AFC staircase procedure, observers adjusted the blur of Gaussian edges (30% contrast) to match the perceived blur of various non-Gaussian test edges. In experiment 1, test stimuli were mixtures of 2 Gaussian edges (eg 10 and 30 min of arc blur) at the same location, while in experiment 2, test stimuli were formed from a blurred edge sharpened to different extents by a compressive transformation. Predictions of the various models were tested against the blur-matching data, but only one model was strongly supported. This was the template model, in which the input signature is the 2nd derivative of the luminance profile, and the templates are applied to this signature at the zero-crossings. The templates are Gaussian derivative receptive fields that covary in width and length to form a self-similar set (ie same shape, different sizes). This naturally predicts that shorter edges should look sharper. As edge length gets shorter, responses of longer templates drop more than shorter ones, and so the response distribution shifts towards shorter (smaller) templates, signalling a sharper edge. The data confirmed this, including the scale-invariance implied by self-similarity, and a good fit was obtained from templates with a length-to-width ratio of about 1. The simultaneous analysis of edge blur and edge location may offer a new solution to the multiscale problem in edge detection.
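A small numerical check of one of the candidate blur codes mentioned above (a generic signal-processing sketch, not the experimental code): for a Gaussian-blurred step edge of blur sigma, the peak and trough of the 2nd derivative sit at ±sigma, so their separation recovers 2·sigma.

```python
# Sketch: recover the blur of a Gaussian-blurred step edge from the
# peak-to-trough separation of its 2nd derivative (separation = 2 * sigma).
import numpy as np
from math import erf

sigma = 10.0                                   # true blur, in samples
x = np.arange(-200.0, 201.0)
edge = np.array([0.5 * (1.0 + erf(xi / (sigma * np.sqrt(2.0)))) for xi in x])

d1 = np.gradient(edge, x)                      # 1st derivative: Gaussian bump
d2 = np.gradient(d1, x)                        # 2nd derivative: peak and trough
separation = x[np.argmin(d2)] - x[np.argmax(d2)]
print(separation / 2.0)                        # blur estimate, approximately sigma
```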

Relevance: 100.00%

Abstract:

Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. This process is well known for its complexity and sensitivity to operation conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operation conditions, which resemble the ranges employed in practice under stable and efficient conditions. Data were collected at steady state using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by the addition of stages at the column entrances. Two parameters were incorporated in the model, namely a mass transfer weight factor to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimising the differences between the experimental and model-predicted concentration profiles at steady state using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and incorporated in the model equations. The model equations comprise a stiff differential-algebraic system, which was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those measured experimentally, and very good agreement between the two profiles was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled both as a multi-loop decentralised SISO (Single Input Single Output) system and as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection.
For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to the interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD); a sketch of the RGA calculation follows below. The rotor speed-raffinate concentration and solvent flowrate-extract concentration loops showed weak interaction. Multivariable MPC showed more effective performance than the conventional techniques since it accounts for loop interactions, time delays, and input-output variable constraints.
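For the pairing analysis mentioned above, the relative gain array is a simple elementwise calculation on the steady-state gain matrix; a minimal sketch with illustrative numbers only (not the column's identified gains):

```python
# Sketch: relative gain array (RGA) for a 2x2 steady-state gain matrix G.
# RGA = G .* (G^{-1})^T elementwise; diagonal elements near 1 indicate weak
# interaction, which supports a decentralised (multi-loop) pairing.
import numpy as np

def rga(G):
    return G * np.linalg.inv(G).T

# hypothetical gain matrix: rows = outputs (raffinate, extract concentration),
# columns = inputs (rotor speed, solvent flowrate)
G = np.array([[2.0, 0.4],
              [0.3, 1.5]])
print(rga(G))   # diagonal elements close to 1 favour the diagonal pairing
```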

Relevance: 100.00%

Abstract:

The thesis investigates the properties of two trends, or time series, which formed part of the co-citation bibliometric model "X-Ray Crystallography and Protein Determination in 1978, 1980 and 1982". This model was one of several created for the 1983 ABRC Science Policy Study, which aimed to test the utility of bibliometric models in a national science policy context. The outcome of the validation part of that study proved to be especially favourable concerning the utility of trend data, which purport to model the development of speciality areas in science over time. This assessment could have important implications for the use of such data in policy formulation. However, one possible problem with the Science Policy Study's conclusions was that insufficient time was available in the study for an in-depth analysis of the data. The thesis aims to continue the validation begun in the ABRC study by providing a detailed examination of the characteristics of the data contained in Trends 11 and 44 of the model. A novel methodology for analysing the properties of the trends with respect to their literature content is presented. This is followed by an assessment, based on questionnaire and interview data, of the ability of Trend 44 to realistically model the historical development of the field of mobile genetic elements research over time, with respect to its scientific content and the activities of its community of researchers. The results of these analyses are then used to evaluate the strengths and weaknesses of a trend or time series approach to modelling the activities of scientific fields. A critical evaluation of the origins of the discovered strengths and weaknesses in the assumptions underlying the techniques used to generate trends from co-citation data is provided, and possible improvements to the modelling techniques are discussed.

Relevance: 100.00%

Abstract:

Reversed-phase high-performance liquid chromatographic (HPLC) methods were developed for the assay of indomethacin, its decomposition products, ibuprofen, and the (tetrahydro-2-furanyl)methyl-, (tetrahydro-2-(2H)pyranyl)methyl- and cyclohexylmethyl esters of ibuprofen. The development and application of these HPLC systems were studied. A number of physico-chemical parameters that affect percutaneous absorption were investigated. The pKa values of indomethacin and ibuprofen were determined using the solubility method; potentiometric titration and the Taft equation were also used for ibuprofen. The incorporation of ethanol or propylene glycol in the solvent improved the aqueous solubility of these compounds. The partition coefficients were evaluated in order to establish the affinity of these drugs towards the stratum corneum. The stability of indomethacin and of the ibuprofen esters was investigated, and the effect of temperature and pH on the decomposition rates was studied. The effect of cetyltrimethylammonium bromide on the alkaline degradation of indomethacin was also followed. In the presence of alcohol, indomethacin alcoholysis was observed; the kinetics of decomposition were subjected to non-linear regression analysis and the rate constants for the various pathways were quantified. The non-isothermal, surfactant non-isoconcentration and non-isopH degradation of indomethacin were investigated, with the analysis undertaken using NONISO, a BASIC computer program. The degradation profiles obtained from the non-iso and iso-kinetic studies show close concordance. The metabolic biotransformation of the ibuprofen esters was followed using esterases from hog liver and rat skin homogenates; the results showed that the esters were very labile under these conditions. The presence of propylene glycol affected the rates of enzymic hydrolysis of the esters, and the hydrolysis was modelled using an equation involving the dielectric constant of the medium. The percutaneous absorption of indomethacin and of ibuprofen and its esters was followed from solutions using an in vitro excised human skin model. The absorption profiles followed first-order kinetics, and the diffusion process was related to solubility and to the human skin/solvent partition coefficient. The percutaneous absorption of two ibuprofen esters from suspensions in 20% propylene glycol-water was also followed through rat skin, with only ibuprofen being detected in the receiver phase. The sensitivity of the ibuprofen esters to enzymic hydrolysis compared with chemical hydrolysis may prove valuable in the formulation of topical delivery systems.
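A generic sketch of the kind of first-order kinetic fitting described above (illustrative only; the concentration values are hypothetical and no attempt is made to reproduce the thesis data), assuming SciPy is available:

```python
# Sketch: fit a first-order degradation/absorption profile C(t) = C0 * exp(-k*t)
# by non-linear regression and recover the rate constant k.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, k):
    return c0 * np.exp(-k * t)

# hypothetical concentration-time data
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 12.0])        # hours
c = np.array([100.0, 82.0, 66.0, 45.0, 20.0, 9.0])    # % remaining

(c0_hat, k_hat), _ = curve_fit(first_order, t, c, p0=(100.0, 0.1))
half_life = np.log(2) / k_hat
print(f"k = {k_hat:.3f} 1/h, t1/2 = {half_life:.2f} h")
```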

Relevance: 100.00%

Abstract:

Strontium has been substituted for calcium in the glass series (SiO2)49.46(Na2O)26.38(P2O5)1.07(CaO)23.08−x(SrO)x (where x = 0, 11.54, 23.08) to elucidate their underlying atomic-scale structural characteristics as a basis for understanding features related to bioactivity. These bioactive glasses have been investigated using isomorphic neutron and X-ray diffraction, Sr K-edge EXAFS and solid-state 17O, 23Na, 29Si, 31P and 43Ca magic-angle-spinning (MAS) NMR. An effective isomorphic substitution first-order difference function has been applied to the neutron diffraction data, confirming that Ca and Sr behave in a similar manner within the glass network, with residual differences attributed solely to the variation in ionic radius between the two species. The diffraction data provide the first direct experimental evidence of split Ca–O nearest-neighbour correlations in these melt-quench bioactive glasses, together with an analogous splitting of the Sr–O correlations; the correlations are attributed to the metal ions being correlated either to bridging or to non-bridging oxygen atoms. Triple-quantum (3Q) 43Ca MAS NMR corroborates the split Ca–O correlations. Successful simplification of the 2 < r (Å) < 3 region via the difference method has also revealed two distinct Na environments, likewise attributed to sodium correlated either to bridging or to non-bridging oxygen atoms. Complementary multinuclear MAS NMR, Sr K-edge EXAFS and X-ray diffraction data support the structural model presented. The structural sites present will be intimately related to the release properties in physiological fluids such as plasma and saliva, and hence to the bioactivity of the material. Detailed structural knowledge is therefore a prerequisite for optimising material design.

Relevance: 100.00%

Abstract:

We present a stochastic agent-based model for the distribution of personal incomes in a developing economy. We start with the assumption that incomes are determined both by individual labour and by stochastic effects of trading and investment. The income from personal effort alone is distributed about a mean, while the income from trade, which may be positive or negative, is proportional to the trader's income. These assumptions lead to a Langevin model with multiplicative noise, from which we derive a Fokker-Planck (FP) equation for the income probability density function (IPDF) and its variation in time. We find that high earners have a power-law income distribution while the low-income groups have a Lévy IPDF. Comparing our analysis with Indian survey data (obtained from the World Bank website: http://go.worldbank.org/SWGZB45DN0) taken over many years, we obtain a near-perfect data collapse onto our model's equilibrium IPDF. Using survey data to relate the IPDF to actual food consumption, we define a poverty index (Sen A. K., Econometrica, 44 (1976) 219; Kakwani N. C., Econometrica, 48 (1980) 437) which is consistent with traditional indices, but independent of an arbitrarily chosen "poverty line" and therefore less susceptible to manipulation. Copyright © EPLA, 2010.
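A generic form of the equations being described (schematic notation; the paper's exact drift and noise terms are not reproduced here): an income x driven by additive effort towards a mean and by multiplicative trading noise obeys a Langevin equation whose associated Fokker-Planck equation governs the IPDF P(x, t).

```latex
% Langevin dynamics with multiplicative noise (schematic):
\mathrm{d}x = a\,(\mu - x)\,\mathrm{d}t + b\,x\,\mathrm{d}W_t ,
% and the corresponding (Ito) Fokker-Planck equation for the income density:
\frac{\partial P}{\partial t}
  = -\frac{\partial}{\partial x}\!\left[a(\mu - x)\,P\right]
    + \frac{b^{2}}{2}\,\frac{\partial^{2}}{\partial x^{2}}\!\left[x^{2} P\right].
```

The stationary solution of equations of this type has a power-law tail, consistent with the high-earner behaviour reported in the abstract.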

Relevance: 100.00%

Abstract:

The rationale for carrying out this research was to address the clear lack of knowledge surrounding the measurement of public hospital performance in Ireland. The objectives of this research were to develop a comprehensive model for measuring hospital performance and to use this model to measure the performance of public acute hospitals in Ireland in 2007. Having assessed the advantages and disadvantages of various measurement models, the Data Envelopment Analysis (DEA) model was chosen for this research. DEA was initiated by Charnes, Cooper and Rhodes in 1978 and further developed by Fare et al. (1983) and Banker et al. (1984). The method used to choose the relevant inputs and outputs for inclusion in the model followed that adopted by Casu et al. (2005), which included the use of focus groups. The main conclusions of the research are threefold. Firstly, it is clear that each stakeholder group has differing opinions on what constitutes good performance; it is therefore imperative that any performance measurement model be designed within parameters that are clearly understood by its intended audience. Secondly, there is a lack of publicly available qualitative information in Ireland, which inhibits detailed analysis of hospital performance. Thirdly, based on the available qualitative and quantitative data, the results indicated a high level of efficiency among the public acute hospitals in Ireland in their staffing and non-pay costs, averaging 98.5%. As DEA scores are sensitive to the number of input and output variables as well as the size of the sample, it should be borne in mind that a high level of efficiency could be a result of using DEA with too many variables relative to the number of hospitals. No hospital was deemed scale efficient in any of the models, even though the average scale efficiency across all hospitals was relatively high at 90.3%. Arising from this research, the main recommendations are that information on medical outcomes, survival rates and patient satisfaction should be made publicly available in Ireland; that, despite a high average efficiency level, many individual hospitals need to focus on improving their technical and scale efficiencies; and that performance measurement models should be developed that include more qualitative data.
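For readers unfamiliar with DEA, the following is a minimal sketch of the input-oriented CCR envelopment problem solved once per hospital (decision-making unit); the data are hypothetical and the function is illustrative, not the thesis's model specification.

```python
# Sketch: input-oriented CCR DEA efficiency scores via linear programming.
# X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta per DMU
# (1.0 = efficient relative to the sample).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1 ... lambda_n]; minimise theta
        c = np.r_[1.0, np.zeros(n)]
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
        b_in = np.zeros(m)
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# hypothetical data: 4 hospitals, inputs = (staff, non-pay cost), output = cases
X = np.array([[120.0, 5.0], [150.0, 7.0], [90.0, 4.0], [200.0, 9.0]])
Y = np.array([[1000.0], [1100.0], [800.0], [1500.0]])
print(dea_ccr_input(X, Y))
```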