907 results for Model selection criteria


Relevance: 90.00%

Abstract:

The supplier/partner selection problem is an integral and important part of any company that aims for competitive and profitable performance in its field of activity. Choosing the best supplier/partner usually requires a careful analysis of the factors that may positively or negatively influence that choice. This problem has long been the subject of numerous studies, which focus essentially on the criteria to consider and the methodologies to adopt in order to optimize the choice of partners. Among the various studies carried out, many consider product cost, quality, delivery and the supplier's reputation as key criteria. Even so, many other criteria are mentioned, most of which appear as subcriteria. In this work, five major criteria were identified: Quality, Financial System, Synergies, Cost and Production System. Within these criteria, it was felt necessary to include subcriteria, so each of the key criteria comprises five subcriteria. Once the criteria were identified, it was necessary to understand how they are applied and which models are used to make the best use of the available information. Knowing that some models favour mathematical programming while others use linear weighting to identify the best supplier, a survey was carried out and companies were contacted in order to understand which factors carried the most weight in their partner selection decisions. After interpreting the results and processing the data, a linear weighting model was adopted to translate the importance of each factor. The proposed model has a hierarchical structure and can be applied with Saaty's AHP method or with the Value Analysis method. The model makes it possible to choose the alternative or alternatives that best suit a company's requirements.
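
The abstract notes that the hierarchical model can be driven by Saaty's AHP. As a rough illustration of how AHP turns pairwise judgments into criterion weights, the sketch below computes the priority vector and consistency ratio for the five top-level criteria; the comparison matrix and its values are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical Saaty pairwise-comparison matrix for the five top-level criteria
# (Quality, Financial System, Synergies, Cost, Production System).
A = np.array([
    [1,   3,   5,   2,   4],
    [1/3, 1,   3,   1/2, 2],
    [1/5, 1/3, 1,   1/4, 1/2],
    [1/2, 2,   4,   1,   3],
    [1/4, 1/2, 2,   1/3, 1],
])

eigvalues, eigvectors = np.linalg.eig(A)
k = np.argmax(eigvalues.real)
w = np.abs(eigvectors[:, k].real)
w /= w.sum()                                  # criterion weights (priority vector)

# Consistency check: CR = ((lambda_max - n) / (n - 1)) / RI, with RI = 1.12 for n = 5.
lam_max = eigvalues.real[k]
n = A.shape[0]
cr = ((lam_max - n) / (n - 1)) / 1.12
print(np.round(w, 3), round(cr, 3))           # judgments are usually accepted when CR < 0.10
```

Subcriterion weights would be obtained in the same way within each criterion and combined multiplicatively down the hierarchy to score the alternatives.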

Relevance: 90.00%

Abstract:

Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. To estimate the number of clusters, one often resorts to information criteria such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters; this EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach, compared with the criteria referred to above, is its speed of execution, which is especially relevant when dealing with large data sets.
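
As a rough sketch of the building block described above, the code below fits a finite mixture of multinomial (categorical) distributions with a plain EM algorithm for a fixed number of components. It is only an illustration of that building block, not the MML-based variant of Figueiredo and Jain (2002), which would additionally annihilate components whose weights collapse and thereby select the number of clusters inside the same loop. The synthetic data and parameter choices are hypothetical.

```python
import numpy as np

def multinomial_mixture_em(X, k, n_iter=200, seed=0):
    """Plain EM for a finite mixture of multinomials (fixed number of components k).

    X : (n, d) array of category counts (e.g. one-hot encoded categorical variables).
    Returns mixing weights, component category probabilities and responsibilities.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(k, 1.0 / k)                      # mixing weights
    theta = rng.dirichlet(np.ones(d), size=k)     # per-component category probabilities

    for _ in range(n_iter):
        # E-step: responsibilities r_ij proportional to pi_j * prod_c theta_jc ** x_ic
        log_r = np.log(pi) + X @ np.log(theta).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights and category probabilities from soft counts
        pi = r.mean(axis=0)
        theta = r.T @ X + 1e-9
        theta /= theta.sum(axis=1, keepdims=True)

    return pi, theta, r

# Tiny synthetic example: two clusters over a single 4-category variable (one-hot rows).
rng = np.random.default_rng(1)
X = np.vstack([np.eye(4)[rng.choice(4, 100, p=[0.7, 0.1, 0.1, 0.1])],
               np.eye(4)[rng.choice(4, 100, p=[0.1, 0.1, 0.1, 0.7])]])
pi, theta, _ = multinomial_mixture_em(X, k=2)
print(np.round(pi, 2), np.round(theta, 2))
```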

Relevance: 90.00%

Abstract:

Supplier selection is nowadays considered strategic for companies operating in increasingly dynamic and demanding environments. This dissertation determines the criteria and methods most commonly used in the supplier selection problem. To this end, articles in the field by distinguished authors were analysed in order to understand which criteria, and from which areas, carry the most influence when deciding on the best suppliers for a company. Based on this study, a short-answer survey was built and sent to companies operating in Portugal, in order to obtain the importance they assign to each criterion. From the responses, it was concluded that criteria related to quality and cost are the most relevant. Regarding methods, AHP and SMART were studied both theoretically and in practice: the first because it is the most frequently cited in the articles reviewed, and the second because it is the simplest to implement and use. For SMART, value functions were created to govern how the method operates. These functions were developed from scratch, based on a prior bibliographic study for each subcriterion, in order to determine the most suitable type of function and to describe each subcriterion's behaviour mathematically. Decision making is very important in organizations, as it can lead to success or failure. The context of decision making is therefore explained, along with the supplier selection problem, how the selection process unfolds, and which methods exist to support the choice of suppliers. Finally, the proposed model, based on the survey results, is presented, together with the application of the two methods (AHP and SMART) for a better understanding of both.
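
As a rough illustration of how SMART combines single-criterion value functions with weights, the following sketch scores two suppliers; the criteria, weights, value-function shapes and performance figures are hypothetical examples, not the ones elicited in the dissertation.

```python
# Hypothetical criteria, linear value functions (scaled to [0, 1]) and swing weights.
value_functions = {
    "unit_cost":   lambda x: max(0.0, min(1.0, (120 - x) / 70)),    # lower cost is better
    "defect_rate": lambda x: max(0.0, min(1.0, 1 - x / 0.05)),      # lower defect rate is better
    "lead_time":   lambda x: max(0.0, min(1.0, (30 - x) / 25)),     # shorter lead time is better
}
weights = {"unit_cost": 0.40, "defect_rate": 0.35, "lead_time": 0.25}   # sum to 1

suppliers = {
    "Supplier A": {"unit_cost": 80, "defect_rate": 0.01, "lead_time": 10},
    "Supplier B": {"unit_cost": 60, "defect_rate": 0.04, "lead_time": 20},
}

def smart_score(performance):
    """Weighted sum of single-criterion values (the additive SMART aggregation)."""
    return sum(weights[c] * value_functions[c](performance[c]) for c in weights)

ranking = sorted(suppliers, key=lambda s: smart_score(suppliers[s]), reverse=True)
for s in ranking:
    print(s, round(smart_score(suppliers[s]), 3))
```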

Relevance: 90.00%

Abstract:

OBJECTIVE: Intensive image surveillance after endovascular aneurysm repair is generally recommended due to continued risk of complications. However, patients at lower risk may not benefit from this strategy. We evaluated the predictive value of the first postoperative computed tomography angiography (CTA) characteristics for aneurysm-related adverse events as a means of patient selection for risk-adapted surveillance. METHODS: All patients treated with the Low-Permeability Excluder Endoprosthesis (W. L. Gore & Assoc, Flagstaff, Ariz) at a tertiary institution from 2004 to 2011 were included. First postoperative CTAs were analyzed for the presence of endoleaks, endograft kinking, distance from the lowermost renal artery to the start of the endograft, and for proximal and distal sealing length using center lumen line reconstructions. The primary end point was freedom from aneurysm-related adverse events. Multivariable Cox regression was used to test postoperative CTA characteristics as independent risk factors, which were subsequently used as selection criteria for low-risk and high-risk groups. Estimates for freedom from adverse events were obtained using Kaplan-Meier survival curves. RESULTS: Included were 131 patients. The median follow-up was 4.1 years (interquartile range, 2.1-6.1). During this period, 30 patients (23%) sustained aneurysm-related adverse events. Seal length <10 mm and presence of endoleak were significant risk factors for this end point. Patients were subsequently categorized as low-risk (proximal and distal seal length ≥10 mm and no endoleak, n = 62) or high-risk (seal length <10 mm or presence of endoleak, or both; n = 69). During follow-up, four low-risk patients (3%) and 26 high-risk patients (19%) sustained events (P < .001). Four secondary interventions were required in three low-risk patients, and 31 secondary interventions in 23 high-risk patients. Sac growth was observed in two low-risk patients and in 15 high-risk patients. The 5-year estimates for freedom from aneurysm-related adverse events were 98% for the low-risk group and 52% for the high-risk group. For each diagnosis, 81.7 image examinations were necessary in the low-risk group and 8.2 in the high-risk group. CONCLUSIONS: Our results suggest that the first postoperative CTA provides important information for risk stratification after endovascular aneurysm repair when the Excluder endoprosthesis is used. In patients with adequate seal and no endoleaks, the risk of aneurysm-related adverse events was significantly reduced, resulting in a large number of unnecessary image examinations. Adjusting the imaging protocol beyond 30 days and up to 5 years, based on individual patients' risk, may result in a more efficient and rational postoperative surveillance.
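
A minimal sketch of the kind of analysis described in the abstract, using the lifelines library: a multivariable Cox model on the two CTA-derived factors, followed by Kaplan-Meier estimates of freedom from adverse events in the low-risk and high-risk strata. The DataFrame and its values are synthetic stand-ins, not the study data.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Synthetic stand-in for the per-patient table (time in years, hypothetical values).
df = pd.DataFrame({
    "time":         [5.1, 2.3, 4.0, 1.2, 6.0, 3.5, 0.8, 5.5, 2.9, 4.4],
    "event":        [0,   1,   0,   1,   0,   0,   1,   0,   1,   0],
    "seal_lt_10mm": [0,   1,   0,   1,   1,   0,   1,   0,   0,   1],
    "endoleak":     [0,   0,   1,   1,   0,   0,   1,   0,   1,   0],
})

# Multivariable Cox regression on the first-CTA characteristics.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()

# Risk stratification and Kaplan-Meier estimates of freedom from adverse events.
high_risk = (df["seal_lt_10mm"] == 1) | (df["endoleak"] == 1)
for label, grp in [("low risk", df[~high_risk]), ("high risk", df[high_risk])]:
    km = KaplanMeierFitter()
    km.fit(grp["time"], event_observed=grp["event"], label=label)
    print(label, float(km.predict(5.0)))   # estimated 5-year freedom from events
```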

Relevance: 90.00%

Abstract:

ABSTRACT Background Mental health promotion is supported by a strong body of knowledge and is a public health matter with the potential for a large impact on society. Mental health promotion programs should be implemented as early as possible in life, preferably starting during pregnancy. Programs should focus on malleable determinants, introducing strategies to reduce risk factors or their impact on mother and child, as well as strengthening protective factors to increase resilience. The ambition of detecting risk situations early requires the development and use of risk assessment tools and the creation of a responsive network of services based in primary health care, especially maternal consultations during pregnancy and the first months of the child's life. The number of risk factors, the way they interact and the extent to which they are buffered by protective factors are relevant for the final impact. Maternal-fetal attachment (MFA) is not yet a fully understood or well operationalized concept. Methodological problems limit the comparison of data, as many studies used small samples, were exploratory in character, or used different selection criteria and different measures. There is still a lack of studies in high-risk populations evaluating the consequences of weak MFA. The available studies are not very conclusive, but they suggest that social support, anxiety and depression, self-esteem, self-control and sense of coherence are correlated with MFA. MFA is also correlated with health practices during pregnancy, which influence pregnancy and infant outcomes. MFA seems a relevant concept for future mother-baby interaction, but more studies are needed to clarify the concept and its operationalization. Attachment is a strong scientific concept with multiple implications for future child development, personality and relationships with others. Secure attachment is considered an essential basis of good mental health, and promoting mother-baby interaction offers an excellent opportunity for intervention programmes aimed at enhancing mental health and well-being. Understanding the process of attachment and intervening to improve it require a comprehension of proximal factors, but also a broader approach that assesses the impact of more distal social conditions on attachment and how this social impact is mediated by family functioning and mother-baby interaction. Finally, it is essential to understand how this knowledge could be translated into effective mental health promoting interventions and measures that could reach large populations of pregnant mothers and families. Strengthening emotional availability (EA) seems to be a relevant approach to improving the mother-baby relationship. In this review we have offered evidence suggesting a range of determinants of the mother-infant relationship, including age, marital relationship, social disadvantage, migration, parental psychiatric disorders and situations of abuse or neglect. Based on this theoretical background we constructed a theoretical model that included proximal and distal factors, risk and protective factors, and variables related to the mother, the father, their social support and mother-baby interaction from early pregnancy until six months after birth. We selected the Antenatal Psychosocial Health Assessment (ALPHA) as an instrument to detect psychosocial risk during pregnancy. Method Ninety-two pregnant women were recruited from the Maternal Health Consultation in Primary Health Care (PHC) at Amadora.
There were three assessment points: at T1 (up to 12 weeks of pregnancy) they filled out a questionnaire that included socio-demographic data, ALPHA, the Edinburgh Postnatal Depression Scale (EDPS), the General Health Questionnaire (GHQ) and the Sense of Coherence scale (SOC); at T2 (after the 20th week of pregnancy) they answered the EDPS, SOC and the MFA Scale (MFAS); and finally at T3 (6 months after birth) they repeated the EDPS and SOC, and their interaction with their babies was videotaped and later evaluated using the EA Scales. Statistical analysis was carried out using descriptive statistics, correlation analysis, univariate logistic regression and multiple linear regression. Results The study increased our knowledge of this particular population living in a multicultural, suburban community. It allowed us to identify specific groups with a higher level of psychosocial risk, such as single or divorced women, young couples, mothers with a low level of education and those who are depressed or have a low SOC. The hypotheses that psychosocial risk is directly correlated with MFAS and that MFA is directly correlated with EA were not confirmed, nor was the correlation between prenatal psychosocial risk and mother-baby EA. The study identified depression as a relevant risk factor in pregnancy, with a higher prevalence in single or divorced women, immigrants and those with higher global psychosocial risk. Depressed women have poorer MFA, lower structuring capacity and higher hostility towards their babies. On average, depression seems to decrease among pregnant women in the second part of their pregnancy. The children of immigrant mothers show a lower level of responsiveness to their mothers, which could be transmitted through depression, as immigrant mothers have a higher risk of depression at the beginning of pregnancy and six months after birth. Young mothers have low MFA and are more intrusive. Women with a higher level of education are more sensitive and their babies were more responsive. Women who are being or have been subjected to abuse were found to have a higher level of MFA, but their babies are less responsive to them. The study highlights the relevance of SOC as a potential protective factor, as it is strongly and negatively related to a wide range of risk factors and mental health outcomes, especially depression before, during and after pregnancy. Conclusions ALPHA proved to be a valid, feasible and reliable instrument for Primary Health Care (PHC) that can be used as a total sum score. We could not prove an association between psychosocial risk factors and MFA, between MFA and EA, or between psychosocial risk and EA. Depression and SOC seem to have clear and opposite relevance in this process. Pregnancy can be considered a maturational process and an opportunity for change, in which adaptation processes occur, buffering risk, decreasing depression and increasing SOC. Further research is necessary to better understand the interactions between variables and to clarify the operationalization of MFA. We recommend the use of ALPHA, SOC and EDPS in early pregnancy as a way of identifying more vulnerable women who will require additional interventions and support in order to decrease risk. At the policy level, we recommend reinforcing immigrant integration and increasing women's education. We recommend a greater focus, in health care and public health, on the mental health and psychosocial risk of specific high-risk groups.
In PHC, special attention should be paid to pregnant women who are single or divorced, very young or poorly educated, and to immigrant mothers. This study provides the basis for an intervention programme for this population that aims to reduce a broad spectrum of risk factors and to promote mental health in women who become pregnant. Health and mental health policies should facilitate the implementation of the suggested measures.

Relevance: 90.00%

Abstract:

A low-background inclusive search for new physics in events with same-sign dileptons is presented. The search uses proton-proton collisions corresponding to 20.3 fb⁻¹ of integrated luminosity taken in 2012 at a centre-of-mass energy of 8 TeV with the ATLAS detector at the LHC. Pairs of isolated leptons with the same electric charge and large transverse momenta of the type e±e±, e±μ±, and μ±μ± are selected and their invariant mass distribution is examined. No excess of events above the expected level of Standard Model background is found. The results are used to set upper limits on the cross sections for processes beyond the Standard Model. Limits are placed as a function of the dilepton invariant mass within a fiducial region corresponding to the signal event selection criteria. Exclusion limits are also derived for a specific model of doubly charged Higgs boson production.

Relevance: 90.00%

Abstract:

This paper develops stochastic search variable selection (SSVS) for zero-inflated count models, which are commonly used in health economics. This allows for either model averaging or model selection in situations with many potential regressors. The proposed techniques are applied to a German data set on the demand for health care. A package for the free statistical software environment R is provided.
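
To make the SSVS idea concrete, here is a minimal spike-and-slab Gibbs sampler for a plain Gaussian linear regression in the style of George and McCulloch; it is only a generic illustration of the mechanism, not the zero-inflated count sampler or the R package developed in the paper, and all tuning constants and data are hypothetical.

```python
import numpy as np

def norm_pdf(x, sd):
    return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def ssvs_linear(X, y, n_iter=2000, tau=0.05, c=10.0, p=0.5, a0=1.0, b0=1.0, seed=0):
    """Spike-and-slab SSVS for y = X @ beta + eps, eps ~ N(0, sigma2).

    Spike: beta_j ~ N(0, tau^2); slab: beta_j ~ N(0, (c*tau)^2); gamma_j ~ Bernoulli(p).
    Returns posterior inclusion probabilities for each regressor.
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    beta, gamma, sigma2 = np.zeros(k), np.ones(k, dtype=int), 1.0
    incl = np.zeros(k)
    for it in range(n_iter):
        # 1) beta | gamma, sigma2 : Gaussian, with prior variances d_j set by gamma
        d = np.where(gamma == 1, (c * tau) ** 2, tau ** 2)
        V = np.linalg.inv(X.T @ X / sigma2 + np.diag(1.0 / d))
        m = V @ (X.T @ y / sigma2)
        beta = rng.multivariate_normal(m, V)
        # 2) gamma_j | beta_j : Bernoulli, odds given by slab vs. spike densities
        slab = p * norm_pdf(beta, c * tau)
        spike = (1.0 - p) * norm_pdf(beta, tau)
        gamma = (rng.random(k) < slab / (slab + spike)).astype(int)
        # 3) sigma2 | beta : inverse gamma
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + resid @ resid / 2.0))
        if it >= n_iter // 2:                      # discard burn-in
            incl += gamma
    return incl / (n_iter - n_iter // 2)

# Toy example: only the first two of five regressors matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=300)
print(np.round(ssvs_linear(X, y), 2))              # high inclusion for x1 and x2 only
```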

Relevance: 90.00%

Abstract:

The evolution of key innovations, novel traits that promote diversification, is often seen as a major driver of the unequal distribution of species richness across the tree of life. In this study, we aim to determine the factors underlying the extraordinary radiation of the subfamily Bromelioideae, one of the most diverse clades within the neotropical plant family Bromeliaceae. Based on an extended molecular phylogenetic data set, we examine the effect of two putative key innovations, namely Crassulacean acid metabolism (CAM) and the water-impounding tank, on speciation and extinction rates. To this end, we develop a novel Bayesian implementation of the phylogenetic comparative method binary-state speciation and extinction, which enables hypothesis testing by Bayes factors and accommodates uncertainty in model selection through Bayesian model averaging. Both CAM and the tank habit were found to correlate with increased net diversification, thus fulfilling the criteria for key innovations. Our analyses further revealed that CAM photosynthesis is correlated with a twofold increase in speciation rate, whereas the evolution of the tank primarily affected extinction rates, which were found to be five times lower in tank-forming lineages than in tank-less clades. These differences are discussed in the light of biogeography, ecology, and past climate change.

Relevance: 90.00%

Abstract:

This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very flexible and can be easily adapted to analyze any of the different priors that have been proposed in the Bayesian instrumental variables literature. We show how to calculate the probability of any relevant restriction (e.g. the posterior probability that over-identifying restrictions hold) and discuss diagnostic checking using the posterior distribution of discrepancy vectors. We illustrate our methods in a returns-to-schooling application.
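
The reversible jump sampler itself is beyond a short example, but the model-averaging logic it implements can be illustrated in a much simpler setting: when the candidate specifications can be enumerated, posterior model probabilities can be approximated from BIC and used to compute posterior inclusion probabilities. The sketch below does this for an ordinary least-squares regression with hypothetical data; it deliberately ignores the instrument and identification issues the paper's algorithm is designed to handle.

```python
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 4))                               # candidate regressors x1..x4
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)

models, bics = [], []
for k in range(X.shape[1] + 1):
    for subset in itertools.combinations(range(X.shape[1]), k):
        Z = sm.add_constant(X[:, list(subset)]) if subset else np.ones((n, 1))
        models.append(subset)
        bics.append(sm.OLS(y, Z).fit().bic)

# exp(-BIC/2) approximates the marginal likelihood, so the normalised weights
# approximate posterior model probabilities under a uniform prior over models.
bics = np.array(bics)
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
incl = [w[[i for i, m in enumerate(models) if j in m]].sum() for j in range(X.shape[1])]
print(dict(zip(["x1", "x2", "x3", "x4"], np.round(incl, 3))))   # inclusion probabilities
```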

Relevance: 90.00%

Abstract:

There is a vast literature that specifies Bayesian shrinkage priors for vector autoregressions (VARs) of possibly large dimensions. In this paper I argue that many of these priors are not appropriate for multi-country settings, which motivates me to develop priors for panel VARs (PVARs). The parametric and semi-parametric priors I suggest not only perform valuable shrinkage in large dimensions, but also allow for soft clustering of variables or countries that are homogeneous. I discuss the implications of these new priors for modelling interdependencies and heterogeneities among different countries in a panel VAR setting. Monte Carlo evidence and an empirical forecasting exercise show clear and important gains from the new priors compared with existing popular priors for VARs and PVARs.
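
For readers unfamiliar with VAR shrinkage, the sketch below builds the prior standard deviations of a classic Minnesota-style prior for a single-country VAR(p); it only illustrates the familiar baseline that the abstract departs from and says nothing about the clustering priors proposed in the paper. The hyperparameters and residual standard deviations are hypothetical.

```python
import numpy as np

def minnesota_prior_sd(sigma, p, lam1=0.2, lam2=0.5):
    """Prior standard deviations for the lag coefficients of an n-variable VAR(p).

    sigma : length-n vector of residual std devs from univariate AR fits.
    Own-lag coefficients get sd lam1 / lag; cross-variable coefficients get
    lam1 * lam2 * sigma_i / (lag * sigma_j), i.e. tighter shrinkage on other
    variables' lags and on more distant lags.
    """
    n = len(sigma)
    sd = np.zeros((n, n * p))
    for i in range(n):                    # equation for variable i
        for lag in range(1, p + 1):
            for j in range(n):            # coefficient on lag `lag` of variable j
                col = (lag - 1) * n + j
                if i == j:
                    sd[i, col] = lam1 / lag
                else:
                    sd[i, col] = lam1 * lam2 * sigma[i] / (lag * sigma[j])
    return sd

print(minnesota_prior_sd(np.array([1.0, 2.0, 0.5]), p=2).round(3))
```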

Relevance: 90.00%

Abstract:

OBJECTIVE: The purpose of this study was to assess outcomes and indications in a large cohort of patients who underwent liver transplantation (LT) for liver metastases (LM) from neuroendocrine tumors (NET) over a 27-year period. BACKGROUND: LT for NET remains controversial due to the absence of clear selection criteria and the scarcity and heterogeneity of reported cases. METHODS: This retrospective multicentric study included 213 patients who underwent LT for NET in 35 centers in 11 European countries between 1982 and 2009. One hundred seven patients underwent transplantation before 2000 and 106 after 2000. Mean age at the time of LT was 46 years. Half of the patients presented with hormone secretion and 55% had hepatomegaly. Before LT, 83% of patients had undergone surgical treatment of the primary tumor and/or LM and 76% had received chemotherapy. The median interval between diagnosis of LM and LT was 25 months (range, 1-149 months). In addition to LT, 24 patients underwent major resection procedures and 30 patients underwent minor resection procedures. RESULTS: Three-month postoperative mortality was 10%. At 5 years after LT, overall survival (OS) was 52% and disease-free survival was 30%. At 5 years from diagnosis of LM, OS was 73%. Multivariate analysis identified 3 predictors of poor outcome: major resection in addition to LT, poor tumor differentiation, and hepatomegaly. Since 2000, 5-year OS has increased to 59%, in relation to fewer patients presenting with poor prognostic factors. Multivariate analysis of the 106 cases treated since 2000 identified the following predictors of poor outcome: hepatomegaly, age more than 45 years, and any amount of resection concurrent with LT. CONCLUSIONS: LT is an effective treatment for unresectable LM from NET. Patient selection based on the aforementioned predictors can achieve a 5-year OS between 60% and 80%. However, use of overly restrictive criteria may deny LT to some patients who could benefit. Optimal timing for LT in patients with stable versus progressive disease remains unclear.

Relevance: 90.00%

Abstract:

Objective: To identify and prioritize improvement opportunities, according to the European Foundation for Quality Management (EFQM) model, for the methadone dispensing service in Andalusian Primary Health Care, from the point of view of professionals. Method: Delphi consensus method, implemented from September 2007 to March 2008 by means of three rounds of interviews with questionnaires administered by e-mail to 39 professionals. The panel of experts was made up of dispensers and prescribers of methadone as well as coordinators of welfare services from the Methadone Treatment Program (MTP). Selection criteria were: being in active employment with a minimum of 3 years' experience. Sample diversification variables: professional role, geographical environment and type of habitat. Recruitment: by means of key professional bodies from different institutions. Results: 48 improvement opportunities were identified, thirteen of which obtained a high level of agreement in the final round. According to the EFQM model, the dimensions that obtained the most consensus in relation to improving the care service were Leadership, and Alliances and Resources. The dimension that caused the greatest disagreement was Processes. Conclusions: Despite having been implemented in Andalusian Primary Health Care since 1997, the methadone dispensing service remains at an implementation phase rather than at what could be classed as a fully deployed stage.

Relevance: 90.00%

Abstract:

Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power and best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
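
A minimal sketch of the GLM-plus-AIC selection step the study evaluates: fit candidate logistic regressions on a simulated presence/pseudo-absence data set and rank them by AIC. The predictor names, the virtual species and the candidate formulas are hypothetical stand-ins, not the study's actual setup.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "temp":   rng.normal(size=n),
    "precip": rng.normal(size=n),
    "slope":  rng.normal(size=n),
})
# "True" virtual species responds to temp and precip only; the 0s act as pseudo-absences.
eta = -0.5 + 2.0 * df["temp"] - 1.0 * df["precip"]
df["presence"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-eta))).astype(int)

candidates = [
    "presence ~ temp",
    "presence ~ temp + precip",            # the "true" model
    "presence ~ temp + precip + slope",
]
fits = {f: smf.logit(f, data=df).fit(disp=0) for f in candidates}
for f, m in sorted(fits.items(), key=lambda kv: kv[1].aic):
    print(f"AIC = {m.aic:8.1f}   {f}")      # lower AIC indicates the preferred model
```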

Relevance: 90.00%

Abstract:

We performed numerical simulations of DNA chains to understand how local geometry of juxtaposed segments in knotted DNA molecules can guide type II DNA topoisomerases to perform very efficient relaxation of DNA knots. We investigated how the various parameters defining the geometry of inter-segmental juxtapositions at sites of inter-segmental passage reactions mediated by type II DNA topoisomerases can affect the topological consequences of these reactions. We confirmed the hypothesis that by recognizing specific geometry of juxtaposed DNA segments in knotted DNA molecules, type II DNA topoisomerases can maintain the steady-state knotting level below the topological equilibrium. In addition, we revealed that a preference for a particular geometry of juxtaposed segments as sites of strand-passage reaction enables type II DNA topoisomerases to select the most efficient pathway of relaxation of complex DNA knots. The analysis of the best selection criteria for efficient relaxation of complex knots revealed that local structures in random configurations of a given knot type statistically behave as analogous local structures in ideal geometric configurations of the corresponding knot type.

Relevance: 90.00%

Abstract:

Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed in order to deal with environmental variables, and the majority of these models lead to diverging results. Second, the choice of input and output variables to be included in the efficiency analysis is often dictated by data availability, and the choice of variables remains an issue even when data are available. As a result, the choice of technique, model and variables is probably, and ultimately, a political judgement. Multi-criteria decision analysis methods can help decision makers select the most suitable model. The number of selection criteria should remain parsimonious and should not be oriented towards the results of the models, in order to avoid opportunistic behaviour. The selection criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model that allows for an environmental adjustment. In this model, an efficiency analysis is conducted with DEA, followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, is whether a school operates on multiple sites. Results show that being located on more than one site has a negative influence on efficiency. A likely way to address this negative influence would be to improve the use of ICT in school management and teaching. The planning of new schools should also consider the advantages of being located on a single site, which makes it possible to reach a critical size in terms of pupils and teachers. The fact that underprivileged pupils perform worse than privileged pupils has been public knowledge since Coleman et al. (1966). As a result, underprivileged pupils have a negative influence on school efficiency; this is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies in order to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed, so other actions need to be taken. In order to define these actions, one has to identify the social-class differences that explain why disadvantaged children underperform. Childrearing and literacy practices, health characteristics, housing stability and economic security influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies. For instance, they could define pre-school, family, health, housing and benefits policies in order to improve the conditions for disadvantaged children.
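
A minimal sketch of the two-stage approach described above: an input-oriented CCR DEA model (constant returns to scale) is solved as a linear program for each school, and the resulting efficiency scores are then regressed on an environmental variable (multi-site operation). This is a generic illustration in the spirit of the Ray (1991) model, not the thesis's actual specification; the data, school inputs/outputs and variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog
import statsmodels.api as sm

# inputs X: (n_schools, n_inputs), outputs Y: (n_schools, n_outputs) -- toy values.
X = np.array([[30.0, 5.0], [40.0, 6.0], [35.0, 4.0], [50.0, 8.0]])   # e.g. teachers, budget
Y = np.array([[400.0], [420.0], [430.0], [450.0]])                    # e.g. aggregate test score
multi_site = np.array([0, 1, 0, 1])                                    # environmental variable

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of school o: min theta s.t. the reference
    combination of schools uses at most theta * inputs of o and at least its outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # decision vars: [theta, lambda_1..n]
    A_ub, b_ub = [], []
    for i in range(m):                                # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(s):                                # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=(0, None))
    return res.x[0]

theta = np.array([ccr_efficiency(o) for o in range(len(X))])
print(np.round(theta, 3))

# Second stage: explain the efficiency scores with the environmental variable.
second_stage = sm.OLS(theta, sm.add_constant(multi_site)).fit()
print(second_stage.params)        # a negative slope would indicate a multi-site penalty
```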