847 results for Discrete Regression and Qualitative Choice Models
Abstract:
A theoretical/conceptual framework is proposed to determine whether the inter-organisational and interpersonal relationships in the netchain of agri-food cooperatives evolve towards a learning netchain. The propositions of the paper show that a greater degree of associationism and greater vertical cooperation/collaboration along the chain are positively related to the horizontal position of the focal firm being closer to the final consumer. This requires joint planning and problem solving, which is positively related to a greater flow and diversity of the information/knowledge obtained and disseminated along the netchain. At the same time, a social context must be developed in which information/knowledge and new ideas flow informally; this is achieved through personal and, above all, professional networks, and through internal and, above all, external networks. All of this will allow greater satisfaction among the members of the agri-food cooperative and its distributors and greater R&D intensity, thereby turning the netchain of the agri-food cooperative into a learning netchain.
Abstract:
Raised bog peat deposits form important archives for reconstructing past changes in climate. Precise and reliable age models are of vital importance for interpreting such archives. We propose enhanced, Markov chain Monte Carlo based methods for obtaining age models from radiocarbon-dated peat cores, based on the assumption of piecewise linear accumulation. Included are automatic choice of sections, a measure of the goodness of fit and outlier downweighting. The approach is illustrated by using a peat core from the Netherlands.
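The core idea can be sketched briefly. The following is a hedged toy illustration, not the authors' algorithm: a single linear accumulation section whose intercept and rate are sampled with a random-walk Metropolis scheme against three invented depth/age/error triples (all values hypothetical).

```python
import math
import random

# Hypothetical dated depths: (depth_cm, age_BP, 1-sigma error) -- invented values
dates = [(10, 500, 30), (50, 2100, 40), (90, 3800, 50)]

def log_lik(a, b):
    """Gaussian log-likelihood of the linear age-depth model age = a + b * depth."""
    return sum(-0.5 * ((age - (a + b * d)) / s) ** 2 for d, age, s in dates)

def metropolis(n_iter=20000, seed=1):
    """Random-walk Metropolis over intercept a (yr) and accumulation rate b (yr/cm)."""
    rng = random.Random(seed)
    a, b = 0.0, 40.0
    samples = []
    for _ in range(n_iter):
        a_new = a + rng.gauss(0, 20)
        b_new = max(1e-6, b + rng.gauss(0, 1))  # accumulation rate kept positive
        if math.log(rng.random() + 1e-300) < log_lik(a_new, b_new) - log_lik(a, b):
            a, b = a_new, b_new
        samples.append((a, b))
    return samples

post = metropolis()
b_mean = sum(b for _, b in post[5000:]) / len(post[5000:])  # posterior mean rate
```

A piecewise model extends this by giving each automatically chosen section its own rate; outlier downweighting and goodness-of-fit measures would sit on top of the same likelihood.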
Abstract:
The use of microbeam approaches has been a major advance in probing the relevance of bystander and adaptive responses in cell and tissue models. Our own studies at the Gray Cancer Institute have used both a charged particle microbeam, producing protons and helium ions, and a soft X-ray microprobe delivering focused carbon-K, aluminium-K and titanium-K soft X-rays. Using these techniques we have been able to build up a comprehensive picture of the underlying differences between bystander responses and direct effects in cell and tissue-like models. What is now clear is that bystander dose-response relationships, the underlying mechanisms of action and the targets involved are not the same as those observed for direct irradiation of DNA in the nucleus. Our recent studies have shown bystander responses even when radiation is deposited away from the nucleus in cytoplasmic targets. In addition, the interaction between bystander and adaptive responses may be complex, depending on dose, the number of cells targeted and the time interval.
Abstract:
In studies of radiation-induced DNA fragmentation and repair, analytical models may provide rapid and easy-to-use methods to test simple hypotheses regarding the breakage and rejoining mechanisms involved. The random breakage model, according to which lesions are distributed uniformly and independently of each other along the DNA, has been the model most used to describe spatial distribution of radiation-induced DNA damage. Recently several mechanistic approaches have been proposed that model clustered damage to DNA. In general, such approaches focus on the study of initial radiation-induced DNA damage and repair, without considering the effects of additional (unwanted and unavoidable) fragmentation that may take place during the experimental procedures. While most approaches, including measurement of total DNA mass below a specified value, allow for the occurrence of background experimental damage by means of simple subtractive procedures, a more detailed analysis of DNA fragmentation necessitates a more accurate treatment. We have developed a new, relatively simple model of DNA breakage and the resulting rejoining kinetics of broken fragments. Initial radiation-induced DNA damage is simulated using a clustered breakage approach, with three free parameters: the number of independently located clusters, each containing several DNA double-strand breaks (DSBs), the average number of DSBs within a cluster (multiplicity of the cluster), and the maximum allowed radius within which DSBs belonging to the same cluster are distributed. Random breakage is simulated as a special case of the DSB clustering procedure. When the model is applied to the analysis of DNA fragmentation as measured with pulsed-field gel electrophoresis (PFGE), the hypothesis that DSBs in proximity rejoin at a different rate from that of sparse isolated breaks can be tested, since the kinetics of rejoining of fragments of varying size may be followed by means of computer simulations. 
The problem of how to account for background damage from experimental handling is also carefully considered. We have shown that the conventional procedure of subtracting the background damage from the experimental data may lead to erroneous conclusions during the analysis of both initial fragmentation and DSB rejoining. Despite its relative simplicity, the method presented allows both the quantitative and qualitative description of radiation-induced DNA fragmentation and subsequent rejoining of double-stranded DNA fragments. (C) 2004 by Radiation Research Society.
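A minimal sketch of the clustered-breakage idea described above (not the authors' code; all parameter values are invented): cluster centres are placed uniformly along the genome, each receives several DSBs within a maximum radius, and fragment sizes then follow from the sorted break positions.

```python
import random

def simulate_dsbs(genome_len, n_clusters, mean_mult, radius, seed=0):
    """Toy clustered-breakage model: cluster centres uniform on the genome,
    each containing >= 1 DSBs scattered within `radius` of the centre."""
    rng = random.Random(seed)
    breaks = []
    for _ in range(n_clusters):
        centre = rng.uniform(0, genome_len)
        # cluster multiplicity: crude positive integer draw around mean_mult
        mult = max(1, round(rng.gauss(mean_mult, mean_mult ** 0.5)))
        for _ in range(mult):
            pos = min(genome_len, max(0.0, centre + rng.uniform(-radius, radius)))
            breaks.append(pos)
    return sorted(breaks)

def fragment_sizes(breaks, genome_len):
    """Fragment lengths induced by sorted break positions (as measured by PFGE)."""
    edges = [0.0] + breaks + [genome_len]
    return [b - a for a, b in zip(edges, edges[1:])]

dsbs = simulate_dsbs(genome_len=1e8, n_clusters=50, mean_mult=3, radius=5e4)
frags = fragment_sizes(dsbs, 1e8)
```

Random breakage is recovered as the special case mean_mult = 1 (one DSB per cluster), mirroring how the paper treats it as a limiting case of the clustering procedure.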
Abstract:
The relationships among organisms and their surroundings can be of immense complexity. To describe and understand an ecosystem as a tangled bank, multiple ways of interaction and their effects have to be considered, such as predation, competition, mutualism and facilitation. Understanding the resulting interaction networks is a challenge in changing environments, e.g. to predict knock-on effects of invasive species and to understand how climate change impacts biodiversity. The elucidation of complex ecological systems with their interactions will benefit enormously from the development of new machine learning tools that aim to infer the structure of interaction networks from field data. In the present study, we propose a novel Bayesian regression and multiple changepoint model (BRAM) for reconstructing species interaction networks from observed species distributions. The model has been devised to allow robust inference in the presence of spatial autocorrelation and distributional heterogeneity. We have evaluated the model on simulated data that combines a trophic niche model with a stochastic population model on a 2-dimensional lattice, and we have compared the performance of our model with L1-penalized sparse regression (LASSO) and non-linear Bayesian networks with the BDe scoring scheme. In addition, we have applied our method to plant ground coverage data from the western shore of the Outer Hebrides with the objective of inferring the ecological interactions. (C) 2012 Elsevier B.V. All rights reserved.
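As a hedged sketch of the L1-penalised baseline mentioned above (not the BRAM model itself), LASSO can be solved by cyclic coordinate descent with soft-thresholding; coefficients driven exactly to zero are read as absent interactions. The data and penalty below are invented.

```python
import random

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO by cyclic coordinate descent: each coefficient is updated by
    soft-thresholding its partial correlation with the residual."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # residual with feature j excluded
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            if rho < -lam:
                w[j] = (rho + lam) / z
            elif rho > lam:
                w[j] = (rho - lam) / z
            else:
                w[j] = 0.0  # coefficient zeroed: no inferred interaction
    return w

rng = random.Random(0)
X = [[rng.gauss(0, 1) for _ in range(4)] for _ in range(100)]
# only "species" 0 and 2 truly influence the response
y = [2.0 * x[0] - 1.5 * x[2] + rng.gauss(0, 0.1) for x in X]
w = lasso_cd(X, y, lam=5.0)
```

With this setup the two true effects survive (slightly shrunk) while the irrelevant coefficients collapse towards zero, which is exactly the sparsity property that makes LASSO a natural baseline for network reconstruction.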
Abstract:
Waste management and sustainability are two core philosophies that the construction sector must acknowledge and implement; however, doing so can prove difficult and time consuming. The aim of this paper is therefore to examine waste management strategies and the possible benefits, advantages and disadvantages of their introduction and use, and to examine any inter-relationship with sustainability, particularly at the design stage. The purpose of this paper is to gather, examine and review published works and to investigate the factors that influence economic decisions at the design phase of a construction project. In addressing this aim, a three-tiered sequential research approach is adopted: in-depth literature review, interviews/focus groups and qualitative analysis. The resulting data are analysed and discussed and conclusions identified, paying particular attention to the implications for practice within architectural firms. This research is of importance, particularly to the architectural sector, as it can add to the industry's understanding of the design process, while also considering the application and integration of waste management into the design procedure. Results indicate that the researched topic has many advantages but also inherent disadvantages. It was found that the potential advantages outweighed the disadvantages, but that uptake within industry was still slow, and that better promotion of these strategies and of their benefits to sustainability, the environment, society and the industry was required.
Abstract:
In this paper we compare a number of the classical models used to characterize fading in body area networks (BANs) with the recently proposed shadowed κ–μ fading model. In particular, we focus on BAN channels which are considered to be susceptible to shadowing by the human body. The measurements considered in this study were conducted at 2.45 GHz for hypothetical BAN channels operating in both anechoic and highly reverberant environments while the person was moving. Compared to the Rice, Nakagami and lognormal fading models, it was found that the recently proposed shadowed κ–μ fading model provided an enhanced fit to the measured data.
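As a hedged illustration of one of the classical baselines (the Rice model, not the shadowed κ–μ model itself), a Rician fading envelope can be simulated as a dominant line-of-sight component plus diffuse Gaussian scatter; the parameter values below are invented.

```python
import math
import random

def rician_envelope(K, omega, n, seed=0):
    """Rician fading envelope samples: a dominant (line-of-sight) component of
    amplitude s plus diffuse Gaussian scatter; K is the LOS-to-scatter power
    ratio and omega = E[R^2] is the total mean power."""
    rng = random.Random(seed)
    s = math.sqrt(K * omega / (K + 1))          # LOS amplitude
    sigma = math.sqrt(omega / (2 * (K + 1)))    # per-component scatter std dev
    return [math.hypot(s + rng.gauss(0, sigma), rng.gauss(0, sigma))
            for _ in range(n)]

env = rician_envelope(K=5.0, omega=1.0, n=20000)
mean_power = sum(r * r for r in env) / len(env)
```

By construction E[R²] = omega, so the sample mean power should approach the chosen omega; fitting such models to measured envelopes is what the comparison in the paper rests on.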
Abstract:
This study is the first to compare random regret minimisation (RRM) and random utility maximisation (RUM) in a freight transport application. This paper aims to compare RRM and RUM in a freight transport scenario involving a negative shock to the reference alternative. Based on data from two stated choice experiments conducted among Swiss logistics managers, this study contributes to the related literature by exploring for the first time the use of mixed logit models in the most recent version of the RRM approach. We further investigate the two choice paradigms by computing elasticities and forecasting choice probabilities. We find that regret is important in describing the managers' choices. Regret increases in the shock scenario, supporting the idea that a shift in the reference point can cause a shift towards regret minimisation. Differences in elasticities and forecast probabilities are identified and discussed.
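For readers unfamiliar with RRM, here is a hedged sketch of the classic specification (not the mixed logit estimated in the paper): each alternative's systematic regret sums, over competitors and attributes, ln(1 + exp(β_m(x_jm − x_im))), and choice probabilities follow a logit on negative regret. The alternatives and coefficients below are hypothetical.

```python
import math

def regret(x_i, others, beta):
    """Systematic regret of alternative i (classic RRM specification):
    R_i = sum over rivals j and attributes m of ln(1 + exp(beta_m*(x_jm - x_im)))."""
    return sum(math.log(1.0 + math.exp(b * (x_j[m] - x_i[m])))
               for x_j in others
               for m, b in enumerate(beta))

def rrm_choice_probs(alts, beta):
    """Logit choice probabilities computed on negative regret."""
    regrets = [regret(alts[i], alts[:i] + alts[i + 1:], beta)
               for i in range(len(alts))]
    exps = [math.exp(-r) for r in regrets]
    total = sum(exps)
    return [e / total for e in exps]

# hypothetical freight alternatives as [cost, transit time]; negative betas mark "bads"
probs = rrm_choice_probs([[10, 5], [8, 9], [12, 4]], beta=[-0.3, -0.2])
```

Unlike RUM, each alternative's attractiveness here depends on pairwise comparisons with all rivals, which is why a shock to the reference alternative can shift behaviour towards regret minimisation.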
Abstract:
The use of handheld near infrared (NIR) instrumentation, as a tool for rapid analysis, has the potential to be used widely in the animal feed sector. A comparison was made between handheld NIR and benchtop instruments in terms of the proximate analysis of poultry feed, using off-the-shelf calibration models and including statistical analysis. Additionally, melamine-adulterated soya bean products were used to develop qualitative and quantitative calibration models from the NIRS spectral data, with excellent calibration models and prediction statistics obtained. For the quantitative approach, the coefficients of determination (R2) were found to be 0.94-0.99, while the corresponding values for the root mean square error of calibration and prediction were 0.081-0.215% and 0.095-0.288%, respectively. In addition, cross-validation was used to further validate the models, with the root mean square error of cross-validation found to be 0.101-0.212%. Furthermore, by adopting a qualitative approach with the spectral data and applying Principal Component Analysis, it was possible to discriminate between adulterated and pure samples.
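The reported figures of merit are standard and easy to reproduce; the following is a hedged sketch with invented melamine levels and predictions (not the paper's data).

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination R^2."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean square error: RMSEC on calibration samples,
    RMSEP on independent prediction samples."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

# invented melamine levels (%) vs. hypothetical model predictions
measured = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
predicted = [0.05, 0.45, 1.10, 1.40, 2.05, 2.45]
r2 = r_squared(measured, predicted)
err = rmse(measured, predicted)
```

The same two statistics, computed on calibration, prediction and cross-validation subsets respectively, give the RMSEC/RMSEP/RMSECV ranges quoted above.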
Abstract:
Background: Fertility issues in HIV-positive men and women are becoming increasingly important. With access to antiretroviral therapy (ART) and the improvement in quality of life it brings, the needs and desires of people living with HIV/AIDS (PLWHA) regarding marriage, childbearing and sexual partnership are expected to change. In Ethiopia, HIV-positive individuals may or may not desire children, and the extent of this desire and how it varies by individual, health and demographic characteristics is not well known.
Objective: The aim of the study was to assess the desire for fertility and associated factors among PLWHA in selected ART clinics of Horro Guduru Wollega Zone, Oromia National Regional State, Ethiopia.
Methods: A cross-sectional, institution-based study employing a quantitative survey and qualitative in-depth interviews was conducted. Three hundred and twenty-one study subjects were selected using a systematic random sampling technique, and data were collected using an interviewer-administered structured questionnaire. Data entry and analysis were performed using EPI Info version 3.5.1 and SPSS version 16. A P-value <0.05 was taken as statistically significant, and logistic regression was used to control for potential confounding factors.
Results: Seventy-three (57.9%) of the males and seventy-six (39%) of the females desired to have children, giving a total of 149 (46.4%) of all study participants. PLWHA who desired children were more likely to be younger (AOR: 3.3, 95% CI: 1.3-8.9), married (AOR: 5.8, 95% CI: 2.7-12.8), childless (AOR: 75, 95% CI: 20.1-273.3) and male (AOR: 1.9, 95% CI: 1.02-3.62) compared with their counterparts. The major reason given by those who did not desire children was already having the desired number of children (80; 46.5%), followed by fear of HIV transmission to the child, reported by 42 (24.4%).
Conclusion: A considerable number of PLWHA want to have a child currently or in the near future. Several variables, including socio-demographic characteristics, partner-related factors, number of living children and HIV-related disease condition, were significantly associated with fertility desire.
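The adjusted odds ratios above come from multivariable logistic regression; as a simpler hedged illustration, a crude (unadjusted) odds ratio and its 95% confidence interval can be computed from a 2x2 table using the standard error of the log odds ratio. All counts below are hypothetical.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% CI for the 2x2 table
    [[exposed with outcome a,   exposed without outcome b],
     [unexposed with outcome c, unexposed without outcome d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # std error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: desire for children among married vs unmarried PLWHA
or_, lo, hi = odds_ratio_ci(90, 60, 59, 112)
```

Adjusted odds ratios (AORs) additionally condition on the other covariates in the regression model, which is why they can differ substantially from crude ORs like this one.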
Abstract:
Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that underlies Bayesian networks. This paper explores the computational complexity of semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in linear time if there is a single observed node, which is a relevant practical case. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together these results provide a clear picture of the inferential complexity in polytree-shaped SQPNs.
Abstract:
This paper explores semi-qualitative probabilistic networks (SQPNs) that combine numeric and qualitative information. We first show that exact inferences with SQPNs are NP^PP-complete. We then show that existing qualitative relations in SQPNs (plus probabilistic logic and imprecise assessments) can be dealt with effectively through multilinear programming. We then discuss learning: we consider a maximum likelihood method that generates point estimates given an SQPN and empirical data, and we describe a Bayesian-minded method that employs the Imprecise Dirichlet Model to generate set-valued estimates.
Abstract:
Associations between socio-demographic and psychological factors and food choice patterns were explored in unemployed young people, who constitute a vulnerable group at risk of poor dietary health. Volunteers (N = 168), male (n = 97) and female (n = 71), aged 15–25 years, were recruited through United Kingdom (UK) community-based organisations serving young people not in education, training or employment (NEET). A survey questionnaire enquired about food poverty and physical activity and measured responses to the Food Involvement Scale (FIS), the Food Self-Efficacy Scale (FSS) and a 19-item Food Frequency Questionnaire (FFQ). A path analysis was undertaken to explore associations between age, gender, food poverty, age at leaving school, food self-efficacy (FS-E), food involvement (FI) (kitchen; uninvolved; enjoyment), physical activity and the four food choice patterns (junk food; healthy; fast food; high fat). FS-E was strong in the model and increased with age. FS-E was positively associated with more frequent choice of healthy food and less frequent choice of junk or high-fat food (having controlled for age, gender and age at leaving school). FI (kitchen and enjoyment) increased with age. Higher FI (kitchen) was associated with less frequent junk food and fast food choice. Being uninvolved with food was associated with more frequent fast food choice. Those who left school after the age of 16 years reported more frequent physical activity. Of the indirect effects, younger individuals had lower FI (kitchen), which led to frequent junk and fast food choice. Older females had higher FI (enjoyment), which led to less frequent fast food choice. Those who had left school before the age of 16 had low food involvement (uninvolved), which led to frequent junk food choice. Multiple indices implied that the data were a good fit to the model, which indicated a need to enhance food self-efficacy and encourage food involvement in order to improve dietary health among these disadvantaged young people.
Abstract:
Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry as it can reduce the frequency of in-line metrology operations and provide supportive information for other operations such as fault detection, predictive maintenance and run-to-run control. Methods with minimal user intervention are required to perform VM in a real-time industrial process. In this paper we propose extreme learning machines (ELM) as a competitive alternative to popular methods like lasso and ridge regression for developing VM models. In addition, we propose a new way to choose the hidden layer weights of ELMs that leads to an improvement in its prediction performance.
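A hedged sketch of a basic ELM (not the paper's proposed hidden-weight selection scheme): input-to-hidden weights are drawn at random and fixed, and only the linear readout is fitted, here with a ridge-regularised least-squares step. The data are a synthetic stand-in for sensor/metrology pairs.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, ridge=1e-3, seed=0):
    """Basic extreme learning machine: random, fixed input-to-hidden weights;
    only the linear readout is solved, via ridge-regularised least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                    # stand-in for equipment sensor data
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2         # stand-in metrology target
W, b, beta = elm_fit(X[:150], y[:150])
pred = elm_predict(X[150:], W, b, beta)
```

Because training reduces to one linear solve, an ELM can be refitted quickly as new metrology samples arrive, which is the property that makes it attractive for real-time VM compared with iteratively trained models.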