856 results for Population set-based methods
Abstract:
Background Nowadays, combining the different sources of information available to improve biological knowledge is a challenge in bioinformatics. Kernel-based methods are among the most powerful methods for integrating heterogeneous data types. Kernel-based data integration approaches consist of two basic steps: first, the right kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables belonging to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the local direction of maximum growth, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge.
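The two-step integration scheme described above (choose a kernel per data source, combine the kernels, then run kernel PCA on the combined Gram matrix) can be sketched as follows. This is a minimal illustration assuming synthetic data, Gaussian kernels and equal combination weights, none of which come from the paper:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel for one data source
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def center_kernel(K):
    # Double-centre the Gram matrix (removes the feature-space mean)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def integrated_kernel_pca(kernels, weights, n_components=2):
    # Step 1: combine the per-dataset kernels by a weighted sum
    K = sum(w * Kd for w, Kd in zip(weights, kernels))
    # Step 2: kernel PCA = eigendecomposition of the centred combined kernel
    vals, vecs = np.linalg.eigh(center_kernel(K))
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    # Sample projections onto the leading kernel principal components
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X1 = rng.normal(size=(30, 5))   # first data source (synthetic)
X2 = rng.normal(size=(30, 8))   # second data source (synthetic)
scores = integrated_kernel_pca([rbf_kernel(X1), rbf_kernel(X2)], [0.5, 0.5])
print(scores.shape)  # (30, 2)
```

The paper's added value, representing input variables on the same plot via local directions of maximum growth, would be layered on top of these sample scores.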
Abstract:
BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infections among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways: (i) based on the diagnostic performance of the algorithms, utilizing the relationship 'incident = true incident + false incident'; (ii) based on the window periods of the algorithms, utilizing the relationship 'Prevalence = Incidence × Duration'. From 2008 to 2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. Incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who have sex with men. Both methods showed a continuous decline in annual incident infections over 2008-2013, totaling -59.5% and -50.2%, respectively. The decline in incident infections continued even in 2012, when a 15% increase in HIV notifications had been observed. This increase was entirely due to older infections. Overall declines during 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia based incident HIV-1 infection surveillance proved useful and reliable.
It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
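The two estimation relationships quoted in the abstract can be made concrete with a small sketch. The counts, sensitivity and specificity values below are invented for illustration and are not taken from the Swiss data:

```python
def true_incident_estimate(n_classified_incident, n_total, sensitivity, specificity):
    # 'incident = true incident + false incident':
    # observed = true * sensitivity + (total - true) * (1 - specificity);
    # solve this for the true number of incident infections.
    false_rate = 1.0 - specificity
    return (n_classified_incident - n_total * false_rate) / (sensitivity - false_rate)

def window_based_incidence(n_recent, window_years):
    # 'Prevalence = Incidence x Duration': cases classified as recent,
    # divided by the algorithm's window period, give annual incidence.
    return n_recent / window_years

# Invented illustration values (not the surveillance figures):
perf = true_incident_estimate(400, 1000, sensitivity=0.90, specificity=0.95)
wind = window_based_incidence(300, window_years=1.0)
print(round(perf), round(wind))  # 412 300
```

In practice each published algorithm has its own performance and window-period parameters, so the study averaged over several such estimates.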
Abstract:
X-ray medical imaging is increasingly becoming three-dimensional (3-D). The dose to the population and its management are of special concern in computed tomography (CT). Task-based methods with model observers to assess the dose-image quality trade-off are promising tools, but they still need to be validated for real volumetric images. The purpose of the present work is to evaluate anthropomorphic model observers in 3-D detection tasks for low-contrast CT images. We scanned a low-contrast phantom containing four types of signals at three dose levels and used two reconstruction algorithms. We implemented a multislice model observer based on the channelized Hotelling observer (msCHO) with anthropomorphic channels and investigated different internal noise methods. We found a good correlation for all tested model observers. These results suggest that the msCHO can be used as a relevant task-based method to evaluate low-contrast detection for CT and optimize scan protocols to lower dose in an efficient way.
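A minimal sketch of a channelized Hotelling observer may help make the idea concrete. The channel outputs below are synthetic Gaussian data rather than CT images, and the anthropomorphic channels, multislice extension (msCHO) and internal-noise variants used in the study are omitted:

```python
import numpy as np

def cho_detectability(signal_feats, noise_feats):
    # Channelized Hotelling observer: build the Hotelling template in
    # channel space and report the detectability index d'.
    ds = signal_feats.mean(axis=0) - noise_feats.mean(axis=0)
    S = 0.5 * (np.cov(signal_feats.T) + np.cov(noise_feats.T))  # pooled scatter
    w = np.linalg.solve(S, ds)      # Hotelling template
    return float(np.sqrt(ds @ w))   # d'

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 1.0, size=(200, 10))  # signal-absent channel outputs
signal = noise + 0.5                          # low-contrast mean shift
d_prime = cho_detectability(signal, noise)
print(d_prime > 0.0)  # True
```

Comparing d' across dose levels and reconstruction algorithms is how such model observers support dose-image quality trade-off studies.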
Abstract:
Water is one of the basic components for life and a ubiquitous source of exposure to contaminants, since the entire population consumes it. The INMA epidemiological study will evaluate whether exposure to nitrates during pregnancy and to water hardness during childhood is associated with low birth weight and atopic eczema, respectively. Objective: To assess the levels of nitrates and hardness in the drinking water of the INMA study population. Methods: A descriptive study conducted in four of the seven INMA cohorts, in Asturias, Gipuzkoa, Sabadell and Valencia. Data on nitrate and hardness levels in the municipalities' drinking water during the period of interest (2003 to 2008 and 2004 to 2012) were collected through town councils and water companies. The mean, standard deviation, maximum and minimum of nitrate and hardness levels were calculated overall and by geographical area, year and season. In Sabadell, three water samplings were carried out to analyze hardness at different points of the city. Results: The average nitrate level (mg/L NO3-) is 4.2 in Asturias, 4.0 in Gipuzkoa, 9.2 in Sabadell and 15.2 in Valencia. The average hardness level (mg/L CaCO3) is 89.1 in Asturias, 132.7 in Gipuzkoa, 178.3 in Valencia and 230.9 in Sabadell. In the analysis performed in Sabadell, the hardness detected was slightly lower than that reported, with no geographical variability. No clear pattern of seasonal or temporal variability is observed for either nitrates or hardness. Conclusions: Variability was detected in the nitrate and hardness levels of the water in the study areas. Nitrate levels are moderate, with the highest found in agricultural areas of Valencia. Water hardness is quite high owing to the calcareous subsoils that predominate in the study areas.
Abstract:
During the past few years, a considerable number of research articles have been published relating to the structure and function of the major photosynthetic protein complexes, photosystem (PS) I, PSII, cytochrome (Cyt) b6f, and adenosine triphosphate (ATP) synthase. Sequencing of the Arabidopsis thaliana (Arabidopsis) genome together with several high-quality proteomics studies has, however, revealed that the thylakoid membrane network of plant chloroplasts still contains a number of functionally unknown proteins. These proteins may have a role as auxiliary proteins guiding the assembly, maintenance, and turnover of the thylakoid protein complexes, or they may be as yet unknown subunits of the photosynthetic complexes. Novel subunits are most likely to be found in the NAD(P)H dehydrogenase (NDH) complex, the structure and function of which have remained obscure in the absence of detailed crystallographic data, thus making this thylakoid protein complex a particularly interesting target of investigation. In this thesis, several novel thylakoid-associated proteins were identified by proteomics-based methods. The major goal of characterization of the stroma thylakoid associated polysome-nascent chain complexes was to determine the proteins that guide the dynamic life cycle of PSII. In addition, a large protein complex of ≥ 1,000 kDa, residing in the stroma thylakoid, was characterized in greater depth and it was found to be a supercomplex composed of the PSI and NDH complexes. A set of newly identified proteins from Arabidopsis thylakoids was subjected to detailed characterization using the reverse genetics approach and extensive biochemical and biophysical analysis. The role of the novel proteins, either as auxiliary proteins or subunits of the photosynthetic protein complexes, was revealed. Two novel thylakoid lumen proteins, TLP18.3 and AtCYP38, function as auxiliary proteins assisting specific steps of the assembly/repair of PSII. 
The role of the 10-kDa thylakoid lumen protein PsbR is related to the optimization of oxygen evolution of PSII by assisting the assembly of the PsbP protein. Two integral thylakoid membrane proteins, NDH45 and NDH48, are novel subunits of the chloroplast NDH complex. Finally, the thylakoid lumen immunophilin AtCYP20-2 is suggested to interact with the NDH complex, instead of PSII as was hypothesized earlier.
Abstract:
Standard Indirect Inference (II) estimators take a given finite-dimensional statistic, Z_n, and then estimate the parameters by matching the sample statistic with the model-implied population moment. We here propose a novel estimation method that utilizes all available information contained in the distribution of Z_n, not just its first moment. This is done by computing the likelihood of Z_n, and then estimating the parameters by either maximizing the likelihood or computing the posterior mean for a given prior of the parameters. These are referred to as the maximum indirect likelihood (MIL) and Bayesian indirect likelihood (BIL) estimators, respectively. We show that the IL estimators are first-order equivalent to the corresponding moment-based II estimator that employs the optimal weighting matrix. However, due to higher-order features of Z_n, the IL estimators are higher-order efficient relative to the standard II estimator. The likelihood of Z_n will in general be unknown, and so simulated versions of the IL estimators are developed. Monte Carlo results for a structural auction model and a DSGE model show that the proposed estimators indeed have attractive finite-sample properties.
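A simulated indirect likelihood estimator of the kind described can be sketched as follows: for each candidate parameter value, simulate the statistic Z_n repeatedly, estimate its density at the observed value with a kernel density estimate, and maximize. The toy model (a normal mean, with the sample mean as Z_n) and the grid search are illustrative simplifications, not the paper's auction or DSGE applications:

```python
import numpy as np

def simulated_indirect_likelihood(z_obs, theta_grid, simulate_stat, n_sim=500, seed=0):
    # For each candidate theta: simulate Z_n many times, estimate the
    # density of Z_n at the observed value with a Gaussian KDE, and
    # keep the theta with the highest log-likelihood (MIL-style).
    rng = np.random.default_rng(seed)
    best_theta, best_ll = None, -np.inf
    for theta in theta_grid:
        zs = np.array([simulate_stat(theta, rng) for _ in range(n_sim)])
        h = 1.06 * zs.std() * n_sim ** (-0.2)  # Silverman's rule bandwidth
        dens = np.exp(-0.5 * ((z_obs - zs) / h) ** 2).mean() / (h * np.sqrt(2 * np.pi))
        ll = np.log(dens + 1e-300)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

# Toy model: data ~ N(theta, 1); the statistic Z_n is the mean of 50 draws
def sample_mean_stat(theta, rng):
    return rng.normal(theta, 1.0, size=50).mean()

estimate = simulated_indirect_likelihood(0.8, np.linspace(0.0, 2.0, 21), sample_mean_stat)
print(float(estimate))
```

The BIL variant would instead average theta over the (unnormalized) likelihood times a prior rather than maximizing.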
Abstract:
The background and inspiration for the present study is earlier research on applications of boundary identification in the metal industry. Effective boundary identification allows smaller safety margins and longer service intervals for the equipment in industrial high-temperature processes, without increased risk of equipment failure. Ideally, a boundary identification method would be based on monitoring some indirect variable that can be measured routinely or at low cost. One such variable for smelting furnaces is the temperature at various positions in the wall. This can be used as the input signal to a boundary identification method for monitoring the furnace wall thickness. We give a background and motivation for the choice of the geometrically one-dimensional dynamic model for boundary identification, discussed in the later part of the work, over a multi-dimensional geometric description. In the industrial applications in question, the dynamics and the advantages of a simple model structure are more important than an exact geometric description. Solution methods for the so-called sideways heat equation have much in common with boundary identification. We therefore study the properties of solutions of this equation, the influence of measurement errors and what is usually called contamination by measurement noise, regularization, and more general consequences of the ill-posedness of the sideways heat equation. We study a set of three different methods for boundary identification, of which the first two were developed from a strictly mathematical starting point and the third from a more applied one. The methods have different properties, with specific advantages and drawbacks. The purely mathematically based methods are characterized by good accuracy and low computational cost, though at the price of low flexibility in the formulation of the model-describing partial differential equation. The third, more applied, method is characterized by poorer accuracy, caused by a higher degree of ill-posedness of the more flexible model.
For this method, an error estimate was also attempted, which was later observed to agree with practical computations using the method. The study can be regarded as a good starting point and mathematical basis for the development of industrial applications of boundary identification, especially toward the handling of nonlinear and discontinuous material properties and sudden changes caused by wall material "falling off". With the methods treated, it appears possible to achieve a robust, fast and sufficiently accurate method of limited complexity for boundary identification.
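The forward counterpart of the boundary-identification problem discussed above is ordinary one-dimensional heat conduction through the furnace wall; interior temperatures from such a model are the kind of input signal the identification methods use. A minimal explicit finite-difference sketch, with all material parameters and temperatures invented for illustration:

```python
import numpy as np

def heat_step(u, r):
    # One explicit finite-difference step of u_t = alpha * u_xx,
    # where r = alpha*dt/dx**2 (the scheme is stable for r <= 0.5)
    un = u.copy()
    un[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

# Invented wall model: hot furnace face on the left, cooled face on the right
nx = 51
r = 0.2                          # alpha*dt/dx**2, inside the stability limit
u = np.zeros(nx)                 # initial wall temperature profile
for _ in range(2000):
    u = heat_step(u, r)
    u[0], u[-1] = 1200.0, 40.0   # fixed boundary temperatures
# An interior 'thermocouple' reading of the kind fed to a
# boundary-identification method as its measured input:
print(float(u[nx // 2]))
```

The inverse (sideways) problem, recovering the position of the hot boundary from such interior readings, is ill-posed: small measurement noise in u is strongly amplified, which is why the regularization discussed in the thesis is needed.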
Abstract:
The objective of this research project is to develop a novel force control scheme for the teleoperation of a hydraulically driven manipulator, and to implement an ideal transparent mapping between human and machine interaction, and machine and task environment interaction. This master's thesis provides a preparatory study for the present research project. The research is limited to a single-degree-of-freedom hydraulic slider with a 6-DOF Phantom haptic device. The key contribution of the thesis is setting up the experimental rig, including the electromechanical haptic device, the hydraulic servo and a 6-DOF force sensor. The slider is first tested as a position servo using a previously developed intelligent switching control algorithm. Subsequently, the teleoperated system is set up and preliminary experiments are carried out. In addition to the development of the single-DOF experimental setup, methods such as passivity control in teleoperation are reviewed. The thesis also contains a review of the modeling of the servo slider, with particular reference to the servo valve. The Markov chain Monte Carlo method is utilized in developing the robustness of the model in the presence of noise.
Abstract:
Computational model-based simulation methods were developed for the modelling of bioaffinity assays. Bioaffinity-based methods are widely used to quantify a biological substance in biological research and development and in routine clinical in vitro diagnostics. Bioaffinity assays are based on the high affinity and structural specificity of the binding between biomolecules. The simulation methods developed are based on a mechanistic assay model, which relies on chemical reaction kinetics and describes the formation of the bound component as a function of time from the initial binding interaction. The simulation methods focused on studying the behaviour and reliability of bioaffinity assays and the possibilities that modelling of binding reaction kinetics provides, such as predicting assay results even before the binding reaction has reached equilibrium. A rapid quantitative result from a clinical bioaffinity assay can be very significant; for example, even the smallest elevation of a heart muscle marker reveals a cardiac injury. The simulation methods were used to identify critical error factors in rapid bioaffinity assays. A new kinetic calibration method was developed to calibrate a measurement system from kinetic measurement data utilizing only one standard concentration. A node-based method was developed to model multi-component binding reactions, which have been a challenge for traditional numerical methods. The node-based method was also used to model protein adsorption as an example of nonspecific binding of biomolecules. These methods have been compared with experimental data and can be utilized in in vitro diagnostics, drug discovery and medical imaging.
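The mechanistic assay model described above reduces, in the simplest one-step case A + B ⇌ AB, to a single rate equation. A minimal Euler-integration sketch with invented rate constants shows how the bound fraction approaches equilibrium; trajectories of this kind are what kinetic calibration and early-result prediction operate on:

```python
import numpy as np

def simulate_binding(a0, b0, kon, koff, t_end, dt=0.01):
    # Euler integration of d[AB]/dt = kon*[A][B] - koff*[AB]
    # for a one-step binding reaction A + B <-> AB
    ab, out = 0.0, []
    for _ in range(int(t_end / dt)):
        ab += dt * (kon * (a0 - ab) * (b0 - ab) - koff * ab)
        out.append(ab)
    return np.array(out)

# Invented rate constants and concentrations (arbitrary units)
traj = simulate_binding(a0=1.0, b0=10.0, kon=1.0, koff=0.1, t_end=10.0)
print(float(traj[-1]))  # bound concentration near its equilibrium value
```

Fitting the early, pre-equilibrium part of such a curve is what allows a quantitative result to be predicted before the reaction has finished; the multi-component, node-based models in the thesis generalize this single equation to networks of coupled reactions.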
Abstract:
In this paper, we review the advances in monocular model-based tracking over the ten-year period up to 2014. In 2005, Lepetit et al. [19] reviewed the status of monocular model-based rigid body tracking. Since then, direct 3D tracking has become a quite popular research area, but monocular model-based tracking should still not be forgotten. We mainly focus on tracking that could be applied to augmented reality, but some other applications are covered as well. Given the wide subject area, this paper tries to give a broad view of the research that has been conducted, giving the reader an introduction to the different disciplines that are tightly related to model-based tracking. The work was conducted by searching through well-known academic search databases in a systematic manner and by selecting certain publications for closer examination. We analyze the results by dividing the found papers into different categories according to their way of implementation. The issues which have not yet been solved are discussed. We also discuss emerging model-based methods, such as fusing different types of features and region-based pose estimation, which could show the way for future research in this subject.
Abstract:
Increased awareness and evolved consumer habits have set more demanding standards for the quality and safety control of food products. The production of foodstuffs which fulfill these standards can be hampered by various low-molecular-weight contaminants. Such compounds include, for example, residues of antibiotics used in animals, and mycotoxins. The extremely small size of these compounds has hindered the development of analytical methods suitable for routine use, and the methods currently in use require expensive instrumentation and qualified personnel to operate them. There is a need for new, cost-efficient and simple assay concepts which can be used for field testing and are capable of processing large sample quantities rapidly. Immunoassays have been considered the gold standard for such rapid on-site screening methods. The introduction of directed antibody engineering and in vitro display technologies has facilitated the development of novel antibody-based methods for the detection of low-molecular-weight food contaminants. The primary aim of this study was to generate and engineer antibodies against low-molecular-weight compounds found in various foodstuffs. The three antigen groups selected as targets of antibody development cause food safety and quality defects in a wide range of products: 1) fluoroquinolones, a family of synthetic broad-spectrum antibacterial drugs used to treat a wide range of human and animal infections; 2) deoxynivalenol, a type B trichothecene mycotoxin, a widely recognized problem for crops and animal feeds globally; and 3) skatole, or 3-methylindole, one of the two compounds responsible for boar taint, found in the meat of monogastric animals. This study describes the generation and engineering of antibodies with versatile binding properties against low-molecular-weight food contaminants, and the subsequent development of immunoassays for the detection of the respective compounds.
Abstract:
The Bachelor's thesis "Choosing a pricing strategy in the steel industry – Case Teräsyhtiö Oy" examines the choice of pricing strategy at a carbon steel company producing industrial goods, by mirroring the company's actual pricing against the theoretical principles of pricing. The objective of the study was to find out how well the target company's pricing follows these theoretical principles. A further aim was to find out how pricing is carried out in the target company and which factors influence this strategic decision. The theoretical part of the thesis consists of the theory of price and of the pricing process, together with the actual framework of the study, namely the traditional models of cost-based and market-based pricing. In the thesis, market-based models refer to demand- and competition-based models. The research data were collected by means of thematic interviews with three people involved in pricing at the case company. The study was carried out as qualitative research, using theory-driven content analysis. In the results, an important role was played by the division between the domestic market and areas outside it. These areas largely determined whether pricing had to be carried out by following or by setting the price. Regarding regional price levels in the industry, the open price lists of steel companies were in a significant position, strongly steering pricing. In the pricing process, the most important objectives were ensuring profitability and practicing consistent pricing. Of the external factors steering market-based pricing, the most important was compliance with the Competition Act (948/2011). The customer's role in pricing was also very significant. The study showed that the target company's pricing emphasizes market-based methods, while taking the effect of costs into account through margins.
The study also showed that customer-perceived value is not taken into account in the groundwork of pricing to the extent that might be necessary. The conclusions of the study emphasize how taking customer value into account could enable higher profitability for the company.
Abstract:
Tannins, typically segregated into two major groups, the hydrolyzable tannins (HTs) and the proanthocyanidins (PAs), are plant polyphenolic secondary metabolites found throughout the plant kingdom. On the one hand, tannins may cause harmful nutritional effects in herbivores, for example insects, and hence serve as plants' defense against plant-eating animals. On the other hand, they may positively affect some herbivores, such as mammals, for example through their antioxidant, antimicrobial, anti-inflammatory or anticarcinogenic activities. This thesis focuses on understanding the bioactivity of plant tannins, their anthelmintic properties and the tools used for the qualitative and quantitative analysis of this endless source of structural diversity. The first part of the experimental work focused on the development of ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) based methods for the rapid fingerprint analysis of bioactive polyphenols, especially tannins. In the second part of the experimental work, the in vitro activity of isolated and purified HTs and their hydrolysis product, gallic acid, was tested against egg hatching and larval motility of two larval developmental stages, L1 and L2, of a common ruminant gastrointestinal parasite, Haemonchus contortus. The results indicated clear relationships between HT structure and anthelmintic activity. The activity of the studied compounds depended on many structural features, including size, the functional groups present in the structure, and structural rigidity. To further understand tannin bioactivity at the molecular level, the interaction between bovine serum albumin (BSA) and seven HTs and epigallocatechin gallate was examined. The objective was to define the effect of pH on the formation of tannin–protein complexes and to evaluate the stability of the formed complexes by gel electrophoresis and MALDI-TOF-MS.
The results indicated that more basic pH values had a stabilizing effect on the tannin–protein complexes and that the tannin oxidative activity was directly linked with their tendency to form covalently stabilized complexes with BSA at increased pH.
Abstract:
The phenomenon of communitas has been described as a moment 'in and out of time' in which a collective of individuals may be experienced by one as equal and individuated in an environment stripped of structural attributes (Turner, 1969). In these moments, emotional bonds form, and an experience of perceived 'oneness' and synergy may be described. As a result of the perceived value of these experiences, Sharpe (2005) has suggested that a clearer understanding of how this phenomenon may be purposefully facilitated would be beneficial for leisure service providers. Consequently, the purpose of this research endeavor was to examine the ways in which a particular leisure service provider systematically employs specific methods and sets specific parameters with the intention of guiding participants toward experiences associated with communitas, or "shared spirit" as described by the organization. A qualitative case study taking a phenomenological approach was employed in order to capture the depth and complexity of both the phenomenon and the purposeful negotiation of experiences in guiding participants toward this phenomenon. The means through which these experiences were intentionally facilitated was recreational music making in a group drumming context. As such, an organization which employs specific methods of rhythm circle facilitation, and which trains other facilitators all over the world, was purposely chosen for its recognition as the most respectable and credible in this field. The specific facilitator was chosen based on a high recommendation from the organization, owing to her level of experience and expertise. Two rhythm circles were held, and participants were chosen randomly by the facilitator. Data were collected through observation in the first circle and participant-observation in the second, as well as through focus groups with circle participants.
Interviews with the facilitator were held both initially, to gain a broad understanding of the concepts and the phenomenon, and after each circle, to reflect on that circle specifically. The data were read repeatedly to draw out the patterns which emerged, and these were coded and organized accordingly. It was found that this specific process or system of implementation led participants to experiences associated with communitas. In order to more clearly understand this process and the ways in which experiences associated with communitas manifest as a result of deliberate facilitator actions, these objective facilitator actions were plotted along a continuum relating to subjective participant experiences. These findings were then linked to the literature with regard to specific characteristics of communitas. In so doing, the intentional manifestation of these experiences may be more clearly understood by future facilitators in many contexts. Beyond this, the findings summarized important considerations regarding the specific technical and communication competencies found to be essential to fostering these experiences for participants within each group. Findings concerning the maintenance of a fluid negotiation of certain transition points within a group rhythm event were also highlighted, and this fluidity was found to be essential to the experience of absorption and engagement in the activity. Emergent themes of structure, control, and consciousness are presented as they manifested and were found to affect experiences within this study. Discussions surrounding the ethics and authenticity of these particular methods and their implementation have also been generated throughout. In conclusion, there was both breadth and depth of knowledge found in unpacking this complex process of guiding individuals toward experiences associated with communitas.
The implications of these findings contribute to broadening the current theoretical as well as practical understanding of how certain intentional parameters may be set, and methods employed, which may lead to experiences of communitas, and also contribute greater knowledge to conceptualizing the manifestation of these experiences when broken down.
Abstract:
Evidence suggests that children with developmental coordination disorder (DCD) have lower levels of cardiorespiratory fitness (CRF) compared to children without the condition. However, these studies were restricted to field-based methods to predict VO2 peak in the determination of CRF. Such field tests have been criticised for their ability to provide a valid prediction of VO2 peak and for their vulnerability to psychological factors in children with DCD, such as low perceived adequacy toward physical activity. Moreover, the contribution of physical activity to the variance in VO2 peak between the two groups is unknown. The purpose of our study was to determine the mediating role of physical activity and perceived adequacy toward physical activity on VO2 peak in children with significant motor impairments. This prospective case-control study involved 122 children (age 12-13 years) with significant motor impairments (n=61) and healthy controls (n=61) matched on age, gender and school location. Participants had previously been assessed for motor proficiency and classified as probable DCD (p-DCD) or healthy control using the Movement ABC test. VO2 peak was measured by a progressive exercise test on a cycle ergometer. Perceived adequacy was measured using a 7-item subscale from the Children's Self-perception of Adequacy and Predilection for Physical Activity scale. Physical activity was monitored for seven days with the Actical® accelerometer. Children with p-DCD had significantly lower VO2 peak (48.76±7.2 ml/ffm/min; p≤0.05) compared to controls (53.12±8.2 ml/ffm/min), even after correcting for fat-free mass. Regression analysis demonstrated that perceived adequacy and physical activity were significant mediators of the relationship between p-DCD and VO2 peak. In conclusion, using a stringent laboratory assessment, the results of the current study verify the findings of earlier studies, adding low CRF to the list of health consequences associated with DCD.
It seems that when testing for CRF in this population, there is a need to consider the psychological barriers associated with their condition. Moreover, strategies to increase physical activity in children with DCD may result in improvement in their CRF.
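The mediation analysis described above (group membership acting on VO2 peak partly through physical activity) can be illustrated with a minimal ordinary-least-squares sketch in the Baron-Kenny style. The data, effect sizes and variable names below are invented for illustration and are not the study's:

```python
import numpy as np

def mediation_effects(x, m, y):
    # Baron-Kenny style decomposition via OLS:
    # total effect of x on y, plus direct and indirect (through m) paths.
    X1 = np.column_stack([np.ones_like(x), x])
    c = np.linalg.lstsq(X1, y, rcond=None)[0][1]        # total effect
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]        # x -> mediator
    X2 = np.column_stack([np.ones_like(x), x, m])
    cp, b = np.linalg.lstsq(X2, y, rcond=None)[0][1:3]  # direct, mediator -> y
    return {"total": c, "direct": cp, "indirect": a * b}

rng = np.random.default_rng(2)
x = rng.integers(0, 2, 400).astype(float)  # group indicator (e.g. p-DCD vs control)
m = 2.0 * x + rng.normal(size=400)         # mediator (e.g. physical activity)
y = 1.5 * m + 0.5 * x + rng.normal(size=400)  # outcome (e.g. fitness measure)
eff = mediation_effects(x, m, y)
print(eff["indirect"])
```

For linear models fitted by OLS on the same sample, the total effect equals the direct plus the indirect effect exactly, which is what makes this decomposition interpretable.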