877 results for Attitudes, Persuasion, Confidence, Voice, Elaboration Likelihood Model


Relevance:

30.00%

Publisher:

Abstract:

When the data consist of certain attributes measured on the same set of items in different situations, they can be described as a three-mode three-way array. A mixture likelihood approach can be implemented to cluster the items (i.e., one of the modes) on the basis of both of the other modes simultaneously (i.e., the attributes measured in different situations). In this paper, it is shown that this approach can be extended to handle three-mode three-way arrays where some of the data values are missing at random in the sense of Little and Rubin (1987). The methodology is illustrated by clustering the genotypes in a three-way soybean data set where various attributes were measured on genotypes grown in several environments.
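
A minimal sketch of the idea (not the authors' implementation): the items' attribute-by-situation matrices are unfolded into vectors and a diagonal-covariance normal mixture is fitted by EM, with missing-at-random entries simply omitted from the observed-data likelihood. Array shapes and variable names are illustrative assumptions.

```python
import numpy as np

def em_mixture_missing(X, g, n_iter=100, seed=0):
    """X: (n_items, n_attributes * n_situations) array with np.nan for missing values."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = ~np.isnan(X)
    Xf = np.where(obs, X, 0.0)                      # zero-filled copy for vectorised sums
    pi = np.full(g, 1.0 / g)                        # mixing proportions
    mu = Xf[rng.choice(n, g, replace=False)]        # component means (crude initialisation)
    var = np.ones((g, p))                           # diagonal component variances

    for _ in range(n_iter):
        # E-step: log-density of each item under each component, observed entries only
        logdens = np.empty((n, g))
        for k in range(g):
            z = (Xf - mu[k]) ** 2 / var[k] + np.log(2 * np.pi * var[k])
            logdens[:, k] = np.log(pi[k]) - 0.5 * np.sum(np.where(obs, z, 0.0), axis=1)
        logdens -= logdens.max(axis=1, keepdims=True)
        tau = np.exp(logdens)
        tau /= tau.sum(axis=1, keepdims=True)       # posterior cluster probabilities

        # M-step: weighted means and variances computed over observed entries only
        for k in range(g):
            w = tau[:, k][:, None] * obs
            denom = w.sum(axis=0) + 1e-12
            mu[k] = (tau[:, k][:, None] * Xf).sum(axis=0) / denom
            var[k] = (w * (np.where(obs, X, mu[k]) - mu[k]) ** 2).sum(axis=0) / denom + 1e-6
        pi = tau.mean(axis=0)
    return tau.argmax(axis=1), mu, var, pi
```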

Relevance:

30.00%

Publisher:

Abstract:

In many occupational safety interventions, the objective is to reduce the injury incidence as well as the mean claims cost once injury has occurred. The claims cost data within a period typically contain a large proportion of zero observations (no claim). The distribution thus comprises a point mass at 0 mixed with a non-degenerate parametric component. Essentially, the likelihood function can be factorized into two orthogonal components. These two components relate respectively to the effect of covariates on the incidence of claims and the magnitude of claims, given that claims are made. Furthermore, the longitudinal nature of the intervention inherently imposes some correlation among the observations. This paper introduces a zero-augmented gamma random effects model for analysing longitudinal data with many zeros. Adopting the generalized linear mixed model (GLMM) approach reduces the original problem to the fitting of two independent GLMMs. The method is applied to evaluate the effectiveness of a workplace risk assessment teams program, trialled within the cleaning services of a Western Australian public hospital.
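
A minimal sketch of the factorisation described above, using ordinary GLMs via statsmodels rather than the paper's GLMMs (i.e. ignoring the longitudinal random effects). The data frame and column names ('claimed', 'cost', the covariate list) are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_zero_augmented_gamma(df, covariates):
    X = sm.add_constant(df[covariates])

    # Component 1: incidence of a claim (Bernoulli outcome, logistic regression)
    incidence = sm.GLM(df["claimed"], X, family=sm.families.Binomial()).fit()

    # Component 2: magnitude of the claim, given that a claim is made (Gamma, log link)
    pos = df["claimed"] == 1
    severity = sm.GLM(df.loc[pos, "cost"], X[pos],
                      family=sm.families.Gamma(link=sm.families.links.Log())).fit()

    # Because the likelihood factorises into orthogonal components, the two fits
    # can be maximised independently; the expected cost per period combines both.
    expected_cost = incidence.predict(X) * severity.predict(X)
    return incidence, severity, expected_cost
```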

Relevance:

30.00%

Publisher:

Abstract:

Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics, 44: 2, 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. Naive implementation of the procedure can lead to computationally inefficient results. To reduce the computational cost a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
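
A minimal sketch of the core E-step quantity: the probability mass a bivariate normal component assigns to a rectangular bin, obtained by inclusion-exclusion on the multivariate CDF (which scipy evaluates numerically). The bin edges and component parameters below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def bin_probability(mean, cov, lower, upper):
    """P(lower1 < X1 <= upper1, lower2 < X2 <= upper2) for X ~ N(mean, cov)."""
    mvn = multivariate_normal(mean, cov)
    l1, l2 = lower
    u1, u2 = upper
    return (mvn.cdf([u1, u2]) - mvn.cdf([l1, u2])
            - mvn.cdf([u1, l2]) + mvn.cdf([l1, l2]))

# Example: mass of one (volume, haemoglobin)-style bin under a single mixture component
p = bin_probability(mean=[90.0, 30.0],
                    cov=[[25.0, 5.0], [5.0, 9.0]],
                    lower=[85.0, 28.0], upper=[95.0, 32.0])
print(round(p, 4))
```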

Relevance:

30.00%

Publisher:

Abstract:

Motivation: This paper introduces the software EMMIX-GENE that has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular, of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant for the clustering of the tissue samples by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic, used in conjunction with a threshold on the size of a cluster, allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so mixtures of factor analyzers are used to effectively reduce the dimension of the feature space of genes. Results: The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes can be selected that reveal interesting clusterings of the tissues, consistent either with the external classification of the tissues or with background biological knowledge of these sets.
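
A minimal sketch of the gene-selection step, with the simplifying assumption of normal rather than t components (via scikit-learn), not the EMMIX-GENE software itself: for each gene, the tissues are treated as a univariate sample and genes are ranked by the likelihood ratio statistic for one versus two mixture components.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def rank_genes_by_lrts(expression, n_init=5, seed=0):
    """expression: (n_genes, n_tissues) array; returns gene indices, largest LRTS first."""
    stats = []
    for gene in expression:
        x = gene.reshape(-1, 1)
        ll = []
        for g in (1, 2):
            gm = GaussianMixture(n_components=g, n_init=n_init,
                                 random_state=seed).fit(x)
            ll.append(gm.score(x) * x.shape[0])      # total log-likelihood
        stats.append(2.0 * (ll[1] - ll[0]))          # -2 log(likelihood ratio)
    stats = np.asarray(stats)
    return np.argsort(stats)[::-1], stats

# Genes whose statistic exceeds a chosen threshold (together with a minimum cluster-size
# rule) would be retained before fitting mixtures of factor analyzers to the tissues.
```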

Relevance:

30.00%

Publisher:

Abstract:

Background: Germline mutations in the CDKN2A gene, which encodes two proteins (p16INK4A and p14ARF), are the most common cause of inherited susceptibility to melanoma. We examined the penetrance of such mutations using data from eight groups from Europe, Australia and the United States that are part of The Melanoma Genetics Consortium. Methods: We analyzed 80 families with documented CDKN2A mutations and multiple cases of cutaneous melanoma. We modeled penetrance for melanoma using a logistic regression model incorporating survival analysis. Hypothesis testing was based on likelihood ratio tests. Covariates included gender, alterations in the p14ARF protein, and population melanoma incidence rates. All statistical tests were two-sided. Results: The 80 analyzed families contained 402 melanoma patients, 320 of whom were tested for mutations and 291 of whom were mutation carriers. We also tested 713 unaffected family members, of whom 194 were carriers. Overall, CDKN2A mutation penetrance was estimated to be 0.30 (95% confidence interval (CI) = 0.12 to 0.62) by age 50 years and 0.67 (95% CI = 0.31 to 0.96) by age 80 years. Penetrance was not statistically significantly modified by gender or by whether the CDKN2A mutation altered the p14ARF protein. However, there was a statistically significant effect of residing in a location with a high population incidence rate of melanoma (P = .003). By age 50 years CDKN2A mutation penetrance reached 0.13 in Europe, 0.50 in the United States, and 0.32 in Australia; by age 80 years it was 0.58 in Europe, 0.76 in the United States, and 0.91 in Australia. Conclusions: This study, which gives the most informed estimates of CDKN2A mutation penetrance available, indicates that penetrance varies with melanoma population incidence rates. Thus, the same factors that affect population incidence of melanoma may also mediate CDKN2A penetrance.
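
A minimal sketch of the hypothesis-testing device used throughout the analysis, a generic nested-model likelihood ratio test; the log-likelihood values and the one-parameter example below are placeholders, not study data.

```python
from scipy.stats import chi2

def lr_test(loglik_null, loglik_alt, df):
    """Likelihood ratio test for nested models differing by df parameters."""
    stat = 2.0 * (loglik_alt - loglik_null)
    return stat, chi2.sf(stat, df)

# e.g. testing whether adding a gender effect (1 extra parameter) improves the penetrance model
stat, p = lr_test(loglik_null=-412.7, loglik_alt=-410.9, df=1)
print(f"LRT statistic = {stat:.2f}, p = {p:.3f}")
```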

Relevance:

30.00%

Publisher:

Abstract:

We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
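
A minimal fully parametric sketch of the mixture formulation (logistic mixing between two failure types, exponential component hazards, a single covariate, right censoring), in the spirit of the fully parametric comparison rather than the paper's semi-parametric ECM method; all names and the single-covariate set-up are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_loglik(theta, t, x, cause):
    """cause: 1 or 2 for the observed failure type, 0 for right-censored observations."""
    a0, a1, logl1, b1, logl2, b2 = theta
    p = expit(a0 + a1 * x)                          # P(failure is of type 1 | x), logistic model
    h1 = np.exp(logl1 + b1 * x)                     # type-1 hazard (exponential PH component)
    h2 = np.exp(logl2 + b2 * x)                     # type-2 hazard
    S1, S2 = np.exp(-h1 * t), np.exp(-h2 * t)
    ll = np.where(cause == 1, np.log(p) + np.log(h1) - h1 * t,
         np.where(cause == 2, np.log1p(-p) + np.log(h2) - h2 * t,
                  np.log(p * S1 + (1 - p) * S2)))   # censored: mixture survivor function
    return -ll.sum()

def fit(t, x, cause):
    res = minimize(neg_loglik, x0=np.zeros(6), args=(t, x, cause), method="Nelder-Mead")
    return res.x
```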

Relevance:

30.00%

Publisher:

Abstract:

A model of iron carbonate (FeCO3) film growth is proposed, which is an extension of the recent mechanistic model of carbon dioxide (CO2) corrosion by Nesic, et al. In the present model, the film growth occurs by precipitation of iron carbonate once saturation is exceeded. The kinetics of precipitation is dependent on temperature and local species concentrations that are calculated by solving the coupled species transport equations. Precipitation tends to build up a layer of FeCO3 on the surface of the steel and reduce the corrosion rate. On the other hand, the corrosion process induces voids under the precipitated film, thus increasing the porosity and leading to a higher corrosion rate. Depending on the environmental parameters such as temperature, pH, CO2 partial pressure, velocity, etc., the balance of the two processes can lead to a variety of outcomes. Very protective films and low corrosion rates are predicted at high pH, temperature, CO2 partial pressure, and Fe2+ ion concentration due to formation of dense protective films as expected. The model has been successfully calibrated against limited experimental data. Parametric testing of the model has been done to gain insight into the effect of various environmental parameters on iron carbonate film formation. The trends shown in the predictions agreed well with the general understanding of the CO2 corrosion process in the presence of iron carbonate films. The present model confirms that the concept of scaling tendency is a good tool for predicting the likelihood of protective iron carbonate film formation.
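
A minimal numerical sketch of the scaling-tendency concept mentioned above: supersaturation drives precipitation, and the precipitation rate is compared with the corrosion rate. The rate constant, solubility product and the simple (S - 1) driving force below are placeholder assumptions, not the calibrated kinetics used in the model.

```python
def supersaturation(c_fe2, c_co3, ksp):
    """S = [Fe2+][CO3^2-] / Ksp; precipitation is only possible once S exceeds 1."""
    return (c_fe2 * c_co3) / ksp

def scaling_tendency(c_fe2, c_co3, ksp, corrosion_rate, k_precip=1.0e-8):
    """Ratio of FeCO3 precipitation rate to corrosion rate (same flux units assumed)."""
    S = supersaturation(c_fe2, c_co3, ksp)
    precip_rate = k_precip * max(S - 1.0, 0.0)      # generic driving force; real kinetics
    return precip_rate / corrosion_rate             # are temperature- and pH-dependent

# A scaling tendency well above 1 suggests dense, protective film growth; well below 1,
# a porous film that cannot keep up with the corrosion-induced void formation.
st = scaling_tendency(c_fe2=2e-4, c_co3=1e-5, ksp=1e-11, corrosion_rate=1e-6)
print(f"scaling tendency = {st:.2f}")
```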

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with an n-fold Weibull competing risk model. A characterisation of the WPP plot is given along with estimation of model parameters when modelling a given data set. These are illustrated through two examples. A study of the different possible shapes for the density and failure rate functions is also presented. (C) 2003 Elsevier Ltd. All rights reserved.
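
A minimal sketch of the n-fold Weibull competing risk structure and the WPP (Weibull probability plot) transform used for the characterisation; the shape and scale values below are illustrative only.

```python
import numpy as np

def survival(t, betas, etas):
    """R(t) = prod_i exp(-(t/eta_i)^beta_i): the item fails at the first sub-failure."""
    t = np.asarray(t, dtype=float)
    return np.exp(-sum((t / e) ** b for b, e in zip(betas, etas)))

def hazard(t, betas, etas):
    """The failure rate is the sum of the component Weibull hazards."""
    t = np.asarray(t, dtype=float)
    return sum((b / e) * (t / e) ** (b - 1) for b, e in zip(betas, etas))

def wpp(t, betas, etas):
    """WPP coordinates: x = ln t, y = ln(-ln R(t))."""
    return np.log(t), np.log(-np.log(survival(t, betas, etas)))

# A two-fold example: a single Weibull plots as a straight line on WPP axes, whereas
# competing risks with different shape parameters produce a characteristically curved plot.
t = np.linspace(0.1, 10.0, 200)
x, y = wpp(t, betas=[0.8, 3.0], etas=[5.0, 4.0])
```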

Relevance:

30.00%

Publisher:

Abstract:

Developed societies are currently facing severe demographic changes: the world population is ageing at an unprecedented rate. This demographic trend will also be accompanied by an increase in the number of people with physical limitations. New challenges are being raised for traditional health care systems, not only in Portugal but also in all other European states. There is an urgent need to find solutions that extend the time people can live in their preferred environment by increasing their autonomy, self-confidence and mobility. AAL4ALL is a project currently being developed in cooperation with 34 Portuguese interdisciplinary partners, from industry to academia, R&D and social disciplines, which employs a novel conceptual approach through the development of an ecosystem of products and services for Ambient Assisted Living (AAL) associated with a business model and validated through a large-scale trial. This paper presents a comparative perspective on the needs and attitudes towards technology of the AAL users and caregivers identified in the analysis of a set of three different surveys: a users survey targeted at Portuguese seniors and pre-seniors; an informal caregivers survey targeted at the family, friends and neighbours who provide care without any financial compensation; and a formal caregivers survey targeted at physicians, nurses, psychologists, social workers, and direct-care workers providing care to elders. The first results indicate that AAL solutions must be affordable, user-friendly and have a true perceived benefit for their users.

Relevance:

30.00%

Publisher:

Abstract:

This article presents a research work whose goal was to develop a model for the broad and balanced evaluation of data quality in the institutional websites of health units. We carried out a literature review of the available approaches for the evaluation of website content quality in order to identify the most recurrent dimensions and attributes, and we conducted a Delphi process with experts in order to reach an adequate set of attributes and their respective weights for the measurement of content quality. The results obtained revealed a high level of consensus among the experts who participated in the Delphi process. Moreover, the statistical analyses and techniques implemented are robust and lend confidence to our results and to the resulting model.
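
An illustrative sketch of one common way to quantify expert consensus in a Delphi exercise, Kendall's coefficient of concordance W; the article does not state which consensus statistic was used, so this is an assumption, and the rankings below are made up.

```python
import numpy as np

def kendalls_w(rankings):
    """rankings: (n_experts, n_attributes) array of ranks 1..n, no ties assumed."""
    rankings = np.asarray(rankings, dtype=float)
    m, n = rankings.shape
    rank_sums = rankings.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))       # 0 = no agreement, 1 = full agreement

# Three experts ranking four website content-quality attributes
w = kendalls_w([[1, 2, 3, 4],
                [1, 3, 2, 4],
                [2, 1, 3, 4]])
print(f"Kendall's W = {w:.2f}")
```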

Relevance:

30.00%

Publisher:

Abstract:

In the past thirty years, a series of plans have been developed by successive Brazilian governments in a continuing effort to maximize the nation's resources for economic and social growth. This planning history has been quantitatively rich but qualitatively poor. The disjunction has stimulated Professor Mello e Souza to address himself to the problem of national planning and to offer some criticisms of Brazilian planning experience. Though political instability has obviously been a factor promoting discontinuity, his criticisms are aimed at the attitudes and strategic concepts which have sought to link planning to national goals and administration. He criticizes the fascination with techniques and plans to the exclusion of proper diagnosis of the socio-political reality, developing instruments to coordinate and carry out objectives, and creating an administrative structure centralized enough to make national decisions and decentralized enough to perform on the basis of those decisions. Thus, fixed, quantified objectives abound while the problem of functioning mechanisms for the coordinated, rational use of resources has been left unattended. Although his interest and criticism are focused on the process and experience of national planning, he recognizes variation in the level and results of Brazilian planning. National plans have failed due to faulty conception of the function of planning. Sectorial plans, save in the sector of the petroleum industry under government responsibility, have not succeeded in overcoming the problems of formulation and execution, thereby repeating old technical errors. Planning for the private sector has a somewhat brighter history due to the use of Grupos Executivos, which has enabled the planning process to transcend the formalism and tradition-bound attitudes of the regular bureaucracy. Regional planning offers two relatively successful experiences, Sudene and the strategy of the regionally oriented autarchy. Thus, planning history in Brazil is not entirely black but a certain shade of grey. The major part of the article, however, is devoted to a descriptive analysis of the national planning experience. The plans included in this analysis are: The Works and Equipment Plan (POE); The Health, Food, Transportation and Energy Plan (Salte); The Program of Goals; The Trienal Plan of Economic and Social Development; and the Plan of Governmental Economic Action (Paeg). Using these five plans for his historical experience, the author sets out a series of errors of formulation and execution by which he analyzes that experience. With respect to formulation, he speaks of a lack of elaboration of programs and projects, of coordination among diverse goals, and of provision of qualified staff and techniques. He mentions the absence of the definition of resources necessary to the financing of the plan and the inadequate quantification of sectorial and national goals due to the lack of reliable statistical information. Finally, he notes the failure to coordinate the annual budget with the multi-year plans. He sees the problems of execution as beginning in the absence of coordination between the various sectors of the public administration, the failure to develop an operative system of decentralization, the absence of any system of financial and fiscal control over execution, the difficulties imposed by the system of public accounting, and the absence of an adequate program of allocation for the liberation of resources.
He ends by pointing to the failure to develop and use an integrated system of political economic tools in a mode compatible with the objective of the plans. The body of the article analyzes national planning experience in Brazil using these lists of errors as a rough model of criticism. Several conclusions emerge from this analysis with regard to planning in Brazil and in developing countries in general. Plans have generally been of little avail in Brazil because of the lack of a continuous, bureaucratized (in the Weberian sense) planning organization set in an instrumentally suitable administrative structure and based on thorough diagnoses of socio-economic conditions and problems. Plans have become the justification for planning. Planning has come to be conceived as a rational method of orienting the process of decisions through the establishment of a precise and quantified relation between means and ends. But this conception has led to a planning history rimmed with frustration and failure because of its rigidity in the face of a flexible and changing reality. Rather, he suggests a conception of planning which understands it "as a rational process of formulating decisions about the policy, economy, and society whose only demand is that of managing the instrumentarium in a harmonious and integrated form in order to reach explicit, but not quantified ends". He calls this "planning without plans": the establishment of broad-scale tendencies through diagnosis whose implementation is carried out through an adjustable, coherent instrumentarium of political-economic tools. Administration according to a plan of multiple, integrated goals is a sound procedure if the nation's administrative machinery contains the technical development needed to control the multiple variables linked to any situation of socio-economic change. Brazil does not possess this level of refinement, and any strategy of planning relevant to its problems must recognize this. The reforms which have been attempted fail to make this recognition, as is true of the conception of planning informing the Brazilian experience. Therefore, unworkable plans, ill-diagnosed, with little or no supportive instrumentarium or flexibility, have been Brazil's legacy. This legacy seems likely to continue until the conception of planning comes to live in the reality of Brazil.

Relevance:

30.00%

Publisher:

Abstract:

Value has been defined in different theoretical contexts as need, desire, interest, standards/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business, and any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, either inside the enterprise or collaborative network or outside. "Perhaps surprising then is that firms often do not know how to define value, or how to measure it" (Anderson and Narus, 1998, cited by [1]). Woodruff echoed that we need "richer customer value theory" for providing an "important tool for locking onto the critical things that managers need to know". In addition, he emphasized, "we need customer value theory that delves deeply into customer's world of product use in their situations" [2]. In this sense, we proposed and validated a novel "Conceptual Model for Decomposing the Value for the Customer". To this end, we were aware that time has a direct impact on customer perceived value, and that the suppliers' and customers' perceptions change from the pre-purchase to the post-purchase phase, causing some uncertainty and doubt. We wanted to break down value into all its components, as well as all built and used assets (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model that relates used and built assets in the tangible and intangible deliverables exchanged among the involved parties with their actual value perceptions.
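
A minimal sketch of the crisp AHP step underlying the approach (the article uses the fuzzy extension, which is not reproduced here; the pairwise judgements below are illustrative assumptions): priority weights for value components are obtained from the principal eigenvector of a pairwise-comparison matrix.

```python
import numpy as np

def ahp_weights(pairwise):
    """pairwise[i, j] = how much more component i contributes to value than component j."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                     # normalised priority weights
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)             # consistency index of the judgements
    return w, ci

# Three value components compared on Saaty's 1-9 scale
weights, ci = ahp_weights([[1,   3,   5],
                           [1/3, 1,   2],
                           [1/5, 1/2, 1]])
print(weights.round(3), round(ci, 3))
```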

Relevance:

30.00%

Publisher:

Abstract:

The species abundance distribution (SAD) has been a central focus of community ecology for over fifty years, and is currently the subject of widespread renewed interest. The gambin model has recently been proposed as a model that provides a superior fit to commonly preferred SAD models. It has also been argued that the model's single parameter (α) presents a potentially informative ecological diversity metric, because it summarises the shape of the SAD in a single number. Despite this potential, few empirical tests of the model have been undertaken, perhaps because the necessary methods and software for fitting the model have not existed. Here, we derive a maximum likelihood method to fit the model, and use it to undertake a comprehensive comparative analysis of the fit of the gambin model. The functions and computational code to fit the model are incorporated in a newly developed free-to-download R package (gambin). We test the gambin model using a variety of datasets and compare the fit of the gambin model to fits obtained using the Poisson lognormal, logseries and zero-sum multinomial distributions. We found that gambin almost universally provided a better fit to the data and that the fit was consistent for a variety of sample grain sizes. We demonstrate how α can be used to differentiate intelligibly between community structures of Azorean arthropods sampled in different land use types. We conclude that gambin presents a flexible model capable of fitting a wide variety of observed SAD data, while providing a useful index of SAD form in its single fitted parameter. As such, gambin has wide potential applicability in the study of SADs, and ecology more generally.
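
A minimal sketch of maximum likelihood SAD fitting in the spirit described above, using the logseries comparison distribution available in scipy (the gambin model itself is fitted with the authors' R package); the abundance vector is a made-up example.

```python
import numpy as np
from scipy.stats import logser
from scipy.optimize import minimize_scalar

def fit_logseries(abundances):
    """Return the ML estimate of the log-series parameter p for observed species abundances."""
    x = np.asarray(abundances)
    nll = lambda p: -logser.logpmf(x, p).sum()
    res = minimize_scalar(nll, bounds=(1e-6, 1 - 1e-6), method="bounded")
    return res.x, -res.fun                           # (p_hat, maximised log-likelihood)

# Species abundances from a hypothetical sample; competing models (e.g. Poisson lognormal,
# zero-sum multinomial, gambin) would then be compared on maximised log-likelihood or AIC.
p_hat, loglik = fit_logseries([1, 1, 1, 2, 2, 3, 5, 8, 13, 34])
print(round(p_hat, 3), round(loglik, 2))
```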

Relevance:

30.00%

Publisher:

Abstract:

This article is an introduction to the theory of the deconstructive paradigm of cooperative learning. Hundreds of studies provide evidence that cooperative learning structures and processes increase academic performance, strengthen lifelong learning competences, and develop each student's social and personal competences in a more effective and equitable way than the traditional learning structures found in schools. Facing the challenges of our educational systems, it is worthwhile to elaborate the theoretical framework of the cooperative learning discourse of the last 40 years from a practical standpoint within its theoretical and methodological context. Over recent decades, the cooperative discourse has elaborated the practical and theoretical elements of cooperative learning structures and processes. We would like to summarise these elements in order to understand what kinds of structural changes can make real differences in teaching and learning practice. The basic principles of cooperative structures, cooperative roles and cooperative attitudes are the main elements briefly described here, so as to create a framework for the theoretical and practical understanding of how the elements of cooperative learning can be brought into classroom practice. In my view, this complex theory of cooperative learning can be understood as a deconstructive paradigm that provides pragmatic answers to the questions of our everyday educational practice, from the classroom level to the level of the educational system, focusing on dismantling hierarchical and antidemocratic learning structures while, at the same time, creating cooperative ones.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To assess the prevalence and factors associated with intimate partner violence after the diagnosis of sexually transmitted diseases.METHODS This cross-sectional study was conducted in Fortaleza, CE, Northeastern Brazil, in 2012 and involved 221 individuals (40.3% male and 59.7% female) attended to at reference health care units for the treatment of sexually transmitted diseases. Data were collected using a questionnaire applied during interviews with each participant. A multivariate analysis with a logistic regression model was conducted using the stepwise technique. Only the variables with a p value < 0.05 were included in the adjusted analysis. The odds ratio (OR) with 95% confidence interval (CI) was used as the measure of effect.RESULTS A total of 30.3% of the participants reported experiencing some type of violence (27.6%, psychological; 5.9%, physical; and 7.2%, sexual) after the diagnosis of sexually transmitted disease. In the multivariate analysis adjusted to assess intimate partner violence after the revelation of the diagnosis of sexually transmitted diseases, the following variables remained statistically significant: extramarital relations (OR = 3.72; 95%CI 1.91;7.26; p = 0.000), alcohol consumption by the partner (OR = 2.16; 95%CI 1.08;4.33; p = 0.026), history of violence prior to diagnosis (OR = 2.87; 95%CI 1.44;5.69; p = 0.003), and fear of disclosing the diagnosis to the partner (OR = 2.66; 95%CI 1.32;5.32; p = 0.006).CONCLUSIONS Individuals who had extramarital relations, experienced violence prior to the diagnosis of sexually transmitted disease, feared disclosing the diagnosis to the partner, and those whose partner consumed alcohol had an increased likelihood of suffering violence. The high prevalence of intimate partner violence suggests that this population is vulnerable and therefore intervention efforts should be directed to them. Referral health care services for the treatment of sexually transmitted diseases can be strategic places to identify and prevent intimate partner violence.
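
A minimal sketch of the analysis style reported above: a logistic regression whose exponentiated coefficients give adjusted odds ratios with 95% confidence intervals. The data frame and column names are illustrative assumptions, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def adjusted_odds_ratios(df, outcome, covariates):
    X = sm.add_constant(df[covariates])
    fit = sm.Logit(df[outcome], X).fit(disp=False)
    return pd.DataFrame({
        "OR": np.exp(fit.params),
        "CI 2.5%": np.exp(fit.conf_int()[0]),
        "CI 97.5%": np.exp(fit.conf_int()[1]),
        "p": fit.pvalues,
    }).drop(index="const")

# e.g. adjusted_odds_ratios(df, "violence_after_diagnosis",
#                           ["extramarital_relations", "partner_alcohol",
#                            "violence_before_diagnosis", "fear_of_disclosure"])
```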